Multiple Regression


This is the statistical method used when you have more than one independent variable (predictor) on an interval or ratio scale and one dependent variable (criterion) on an interval or ratio scale. The purpose of multiple regression is to gain the ability to predict the criterion and to determine which predictor is the best (or strongest).

Having more than one predictor allows us to answer a number of interesting questions:


 * (1) How much total variance in the criterion is explained by the predictors?
 * (2) Which variables are unique predictors of the criterion?
 * (3) Which variable is the best predictor of the criterion?

A multiple regression analysis provides four key pieces of information: the R value, the R squared value, the unstandardized regression weight (B value, or slope), and the standardized regression weight (Beta weight).

R Value
The R value shows the strength of the relationship between the set of predictors as a whole and the criterion.

R Squared
The R squared value shows the proportion of variance in the criterion that is explained by the predictors.
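Although the page's procedure uses SPSS, R and R squared can be computed by hand. Here is a minimal NumPy sketch with made-up data (all variable names and values are hypothetical, not from the original text):

```python
import numpy as np

# Hypothetical toy data: two predictors (x1, x2) and one criterion (y)
rng = np.random.default_rng(0)
n = 50
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
y = 2.0 * x1 + 0.5 * x2 + rng.normal(size=n)

# Design matrix with an intercept column
X = np.column_stack([np.ones(n), x1, x2])

# Ordinary least squares fit: b holds the intercept and the two slopes
b, *_ = np.linalg.lstsq(X, y, rcond=None)
y_hat = X @ b

# R squared: proportion of criterion variance explained by the predictors
ss_res = np.sum((y - y_hat) ** 2)
ss_tot = np.sum((y - np.mean(y)) ** 2)
r_squared = 1 - ss_res / ss_tot

# Multiple R: the correlation between predicted and observed criterion scores
r = np.sqrt(r_squared)
```

These are the same two numbers SPSS reports in its Model Summary table.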

Unstandardized Regression Weight
The unstandardized regression weight (B value, or slope) for each predictor shows the relationship between that predictor and the criterion. This number is expressed in the units of the predictor, so it cannot be used to compare predictors.


 * For example, what if one predictor were height and the other annual salary? 176 cm is on a much smaller scale than $40,000.

Standardized Regression Weight
The standardized regression weight (Beta weight) shows which predictor is stronger and whether a predictor has a unique relationship with the criterion. Because this value is standardized, predictors can be compared directly. If a Beta weight is significant, the predictor shares a significant amount of unique variance with the criterion.
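The height-versus-salary point above can be made concrete. In this hypothetical NumPy sketch (all data made up), salary is the stronger predictor, yet its B weight is tiny simply because it is measured in dollars; the Beta weights, which rescale each B by sd(predictor)/sd(criterion), reveal the true ordering:

```python
import numpy as np

# Made-up data: height in cm, salary in dollars, plus a criterion y
rng = np.random.default_rng(1)
n = 100
height_cm = rng.normal(176, 8, n)       # predictor on a small scale
salary = rng.normal(40000, 5000, n)     # predictor on a huge scale
y = 0.5 * (height_cm - 176) / 8 + 1.0 * (salary - 40000) / 5000 + rng.normal(size=n)

# Unstandardized B weights from an ordinary least squares fit
X = np.column_stack([np.ones(n), height_cm, salary])
b, *_ = np.linalg.lstsq(X, y, rcond=None)

# Beta weights: rescale each B by sd(predictor) / sd(criterion)
beta_height = b[1] * height_cm.std(ddof=1) / y.std(ddof=1)
beta_salary = b[2] * salary.std(ddof=1) / y.std(ddof=1)
```

Here `b[2]` (salary's B) is far smaller than `b[1]` (height's B) because of the dollar units, but `beta_salary` exceeds `beta_height`, correctly identifying salary as the stronger predictor.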

Multicollinearity
This occurs when the predictors are related to each other.

The Beta weights show the unique relationship between each predictor and the criterion, BUT when predictors are correlated, the amount of unique variance explained by any single predictor may be small.

Multicollinearity can also signal that two predictors are actually measuring the same thing.
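One common way to quantify multicollinearity is the variance inflation factor (VIF), which SPSS reports when collinearity diagnostics are requested. This is an illustrative NumPy sketch with made-up data; the `vif` helper is written here for demonstration, not taken from any library:

```python
import numpy as np

# Made-up data: x2 is deliberately built to correlate about .90 with x1
rng = np.random.default_rng(2)
n = 200
x1 = rng.normal(size=n)
x2 = 0.9 * x1 + np.sqrt(1 - 0.81) * rng.normal(size=n)
X = np.column_stack([x1, x2])

def vif(X):
    """Variance inflation factor for each predictor column of X:
    1 / (1 - R^2), where R^2 comes from regressing that predictor
    on all of the other predictors (with an intercept)."""
    factors = []
    for j in range(X.shape[1]):
        target = X[:, j]
        others = np.delete(X, j, axis=1)
        A = np.column_stack([np.ones(len(target)), others])
        b, *_ = np.linalg.lstsq(A, target, rcond=None)
        resid = target - A @ b
        r2 = 1 - (resid @ resid) / ((target - target.mean()) @ (target - target.mean()))
        factors.append(1 / (1 - r2))
    return factors
```

A VIF of 1 means a predictor shares no variance with the others; values well above 1 (here roughly 5 for both predictors) flag the overlap that shrinks each predictor's unique contribution.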

Multiple Regression in SPSS

 * 1) Click 'Analyze' -> 'Regression' -> 'Linear'
 * 2) In the left-hand box, highlight the criterion (Y) variable, then click the arrow to move it into the 'Dependent' box
 * 3) For a predictor variable, highlight the X variable and click the arrow to move it into the 'Independent(s)' box
 * 4) Repeat step 3 for each predictor variable. Once all of your predictors are in the 'Independent(s)' box, continue to step 5
 * 5) Click the 'Statistics' button
 * 6) Make sure the box next to 'Collinearity diagnostics' is checked, then click the 'Continue' button
 * 7) Click 'OK'
 * 8) Your output should appear
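The key numbers in that SPSS output can also be reproduced directly. This hedged NumPy sketch (made-up data; all names hypothetical) computes the Model Summary R squared along with the standard error and t statistic for each B weight, which is what underlies SPSS's significance test for the regression weights:

```python
import numpy as np

# Made-up data mirroring a two-predictor run
rng = np.random.default_rng(3)
n = 60
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
y = 1.5 * x1 + 0.8 * x2 + rng.normal(size=n)

# Fit the model and get the residuals
X = np.column_stack([np.ones(n), x1, x2])
b, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ b

# Model Summary: R squared
ss_res = resid @ resid
ss_tot = (y - y.mean()) @ (y - y.mean())
r_squared = 1 - ss_res / ss_tot

# Coefficients table: standard error and t statistic for each B weight
df = n - X.shape[1]                              # residual degrees of freedom
s2 = ss_res / df                                 # residual variance estimate
se = np.sqrt(np.diag(s2 * np.linalg.inv(X.T @ X)))
t = b / se                                       # compare against a t distribution with df
```

A t statistic well beyond about +/-2 (for samples of this size) corresponds to the "significant" flag SPSS attaches to a predictor's weight.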