
Page 1: Chapter 14 Partial Correlation and Multiple Regression and Correlation (14-1)

Page 2: In This Presentation (14-2)

In this presentation you will learn about:

• Partial Correlation
• Multiple Regression: Assessing the Effects of the Independent Variables
• Multiple Correlation (R²)

Page 3: Partial Correlation and Multiple Regression and Correlation (14-3)

• Partial correlation and multiple regression and correlation are multivariate techniques for interval-ratio variables (multivariate techniques for nominal- or ordinal-level variables are covered on the textbook's website).
  – Partial correlation allows us to examine how a bivariate relationship changes when one or more control variables are introduced.
  – Multiple regression and correlation assess the effects, separately and in combination, of two or more independent variables on the dependent variable.

Page 4: Partial Correlation and Multiple Regression and Correlation (continued) (14-4)

• Partial correlation and multiple regression and correlation are extensions of, and are based on, the least-squares regression line and Pearson's r (Chapter 13).
• We will focus here on multiple regression and correlation with two independent variables (for ease of explanation), though these techniques extend to situations involving three or more independent variables.
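
To make the idea of controlling for a third variable concrete, here is a minimal sketch of a first-order partial correlation, computed from three zero-order (bivariate) Pearson correlations. The function is illustrative only; the slides do not show this computation, and the resulting value is not a figure from the textbook.

    import math

    def partial_r(r_yx, r_yz, r_xz):
        """First-order partial correlation of Y and X, controlling for Z,
        computed from the three zero-order (bivariate) correlations."""
        return (r_yx - r_yz * r_xz) / (
            math.sqrt(1 - r_yz ** 2) * math.sqrt(1 - r_xz ** 2)
        )

    # Zero-order correlations reported later on slide 14-12; the result (~0.43)
    # only illustrates the formula and does not appear in the slides.
    print(round(partial_r(r_yx=0.50, r_yz=-0.30, r_xz=-0.47), 2))  # 0.43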

Page 5: Multiple Regression and Correlation (14-5)

• Multiple regression and correlation allow us to:
  – Disentangle and examine the separate effects of the independent variables.
  – Use all of the independent variables to predict Y.
  – Assess the combined effects of the independent variables on Y.

Page 6: Multiple Regression (14-6)

• Least-squares multiple regression equation:

  Y = a + b1X1 + b2X2

  where
  a  = the Y intercept (Formula 14.6)
  b1 = the partial slope of the first independent variable (X1) on Y (Formula 14.4)
  b2 = the partial slope of the second independent variable (X2) on Y (Formula 14.5)

Page 7: Multiple Regression (continued) (14-7)

• This least-squares regression equation involves two independent variables, but it can be modified to include any number of independent variables.
• The partial slopes:
  – show the effect of each independent variable (in its original units) on Y while controlling for the effect of the other independent variable(s);
  – must be computed before a (the Y intercept) can be computed.

Page 8: Multiple Regression (continued) (14-8)
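
Slides 14-6 to 14-8 cite Formulas 14.4 to 14.6 for the partial slopes and the Y intercept without reproducing them. The sketch below assumes the common textbook form that writes the partial slopes in terms of the zero-order correlations and standard deviations; the summary statistics passed in are placeholders, not the values from Table 14.1.

    def partial_slopes(s_y, s_1, s_2, r_y1, r_y2, r_12):
        """Partial slopes b1 and b2 from standard deviations and zero-order
        correlations (assumed to be the form of Formulas 14.4 and 14.5)."""
        b1 = (s_y / s_1) * (r_y1 - r_y2 * r_12) / (1 - r_12 ** 2)
        b2 = (s_y / s_2) * (r_y2 - r_y1 * r_12) / (1 - r_12 ** 2)
        return b1, b2

    def intercept(mean_y, mean_1, mean_2, b1, b2):
        """Y intercept: a = mean(Y) - b1*mean(X1) - b2*mean(X2) (Formula 14.6)."""
        return mean_y - b1 * mean_1 - b2 * mean_2

    # Placeholder summary statistics -- NOT the Table 14.1 values.
    b1, b2 = partial_slopes(s_y=2.0, s_1=1.5, s_2=2.5,
                            r_y1=0.50, r_y2=-0.30, r_12=-0.47)
    a = intercept(mean_y=3.0, mean_1=2.0, mean_2=13.0, b1=b1, b2=b2)
    print(round(a, 2), round(b1, 2), round(b2, 2))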

Page 9: Multiple Regression: An Example (14-9)

• In Chapter 13 we considered the relationship between number of children (X) and husband's contribution to housework (Y) for 12 dual-earner families.
• Here we will assess the effects of two independent variables on husband's contribution to housework: number of children (X1) and years of education completed by the husband (X2).

Page 10: Multiple Regression: An Example (continued) (14-10)

• Table 14.1 shows the data for these three variables.

Page 11: Multiple Regression: An Example (continued) (14-11)

• All the information needed to determine the multiple regression equation, calculated from Table 14.1, is shown below.

Page 12: Multiple Regression: An Example (continued) (14-12)

• The zero-order correlations (correlation coefficients calculated for bivariate relationships are often referred to as "zero-order" correlations) indicate that:
  o husband's contribution to housework is positively related to number of children (ry1 = 0.50);
  o husbands with higher education tend to do less housework (ry2 = -0.30);
  o husbands with higher education tend to have fewer children (r12 = -0.47).
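
For readers following along in software, the zero-order correlations are simply the pairwise Pearson correlations among the three variables. A minimal numpy sketch, using made-up data because Table 14.1 itself is not reproduced in this transcript:

    import numpy as np

    # Made-up illustration data (NOT the Table 14.1 values): husband's weekly
    # housework hours (y), number of children (x1), husband's education (x2).
    y  = np.array([1, 2, 3, 5, 3, 1, 5, 0, 6, 3, 7, 4])
    x1 = np.array([1, 1, 1, 1, 2, 2, 3, 3, 4, 4, 5, 5])
    x2 = np.array([12, 14, 16, 16, 18, 16, 12, 12, 10, 12, 10, 16])

    R = np.corrcoef(np.vstack([y, x1, x2]))  # 3 x 3 matrix of zero-order correlations
    r_y1, r_y2, r_12 = R[0, 1], R[0, 2], R[1, 2]
    print(round(r_y1, 2), round(r_y2, 2), round(r_12, 2))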

Page 13: Multiple Regression: An Example (continued) (14-13)

• The partial slope for the first independent variable, number of children (X1), is b1 = 0.65.
• A slope of 0.65 means that the amount of time the husband contributes to housekeeping chores increases by 0.65 hours per week for each additional child in the family, controlling for the effect of education.

Page 14: Multiple Regression: An Example (continued) (14-14)

• The partial slope for the second independent variable, education (X2), is b2 = -0.07.
• A slope of -0.07 means that the amount of time the husband contributes to housekeeping chores decreases by 0.07 hours per week for each additional year of education completed by the husband, controlling for the effect of number of children.

Page 15: Multiple Regression: An Example (continued) (14-15)

• With the partial slopes in hand, we can now find the Y intercept (a) and write the least-squares multiple regression equation.
  o As was the case with the bivariate regression line (Chapter 13), this equation can be used to predict scores on the dependent variable from scores on the independent variables.
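
The intercept and the fitted equation do not survive in this transcript, so the sketch below combines the two partial slopes reported on slides 14-13 and 14-14 with a placeholder intercept (the real value would come from Formula 14.6 and Table 14.1) to show how the equation generates predictions.

    # Partial slopes from slides 14-13 and 14-14; the intercept is a PLACEHOLDER,
    # not the value computed in the textbook.
    a, b1, b2 = 1.5, 0.65, -0.07

    def predict_housework(children, education):
        """Predicted hours per week of husband's housework: Y' = a + b1*X1 + b2*X2."""
        return a + b1 * children + b2 * education

    print(predict_housework(children=2, education=12))  # 1.5 + 1.30 - 0.84 = 1.96
    print(predict_housework(children=3, education=12))  # one more child adds b1 = 0.65 hours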

Page 16: Standardized Partial Slopes (Beta-Weights) (14-16)

• Partial slopes (b1 and b2) are in the original units of the independent variables.
• To compare the relative effects of the independent variables, compute beta-weights (b*).
• Beta-weights show the amount of change in the standardized scores of Y for a one-unit change in the standardized scores of each independent variable, while controlling for the effects of all other independent variables.

Page 17: Standardized Partial Slopes (Beta-Weights) (continued) (14-17)

• Use Formula 14.7 to calculate the beta-weight for X1.
• Use Formula 14.8 to calculate the beta-weight for X2.
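
A minimal sketch of the standardization that beta-weights perform: each partial slope is rescaled by the ratio of its variable's standard deviation to the standard deviation of Y (assumed here to be what Formulas 14.7 and 14.8 compute; the standard deviations below are placeholders, not Table 14.1 values).

    def beta_weight(b, s_x, s_y):
        """Standardized partial slope: b* = b * (s_x / s_y)."""
        return b * (s_x / s_y)

    # Partial slopes from slides 14-13 and 14-14; standard deviations are placeholders.
    b1, b2 = 0.65, -0.07
    s_y, s_1, s_2 = 2.0, 1.5, 2.5

    beta1 = beta_weight(b1, s_1, s_y)  # children, in standard-deviation units of Y
    beta2 = beta_weight(b2, s_2, s_y)  # education, in standard-deviation units of Y
    print(round(beta1, 2), round(beta2, 2))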

Page 18: Beta-Weights: An Example (14-18)

• Beta-weight for the first independent variable, number of children (X1):
• Beta-weight for the second independent variable, education (X2):

Page 19: Beta-Weights: An Example (continued) (14-19)

• The beta-weights show that number of children (X1) has a much larger effect than education (X2) on husband's contribution to housekeeping chores.

Page 20: Multiple Correlation (R²) (14-20)

• The multiple correlation coefficient (R²) shows the combined effects of all independent variables on the dependent variable.

Page 21: Multiple Correlation (R²) (continued) (14-21)

• Formula 14.11 allows X1 to explain as much of the variation in Y as it can, and then adds in the effect of X2 after X1 is controlled.
• Formula 14.11 thus eliminates the overlap in the explained variance between X1 and X2.
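
Formula 14.11 itself is not reproduced in this transcript, but the description above matches the usual form R² = r²y1 + r²y2·1(1 − r²y1), where ry2·1 is the partial correlation of Y with X2 controlling for X1. Treating that form as an assumption, the sketch below plugs in the zero-order correlations from slide 14-12 and reproduces the R² = 0.255 reported on slide 14-23.

    import math

    r_y1, r_y2, r_12 = 0.50, -0.30, -0.47  # zero-order correlations (slide 14-12)

    # Partial correlation of Y and X2, controlling for X1.
    r_y2_1 = (r_y2 - r_y1 * r_12) / (
        math.sqrt(1 - r_y1 ** 2) * math.sqrt(1 - r_12 ** 2)
    )

    # X1 first explains r_y1**2 of the variance; X2 then adds r_y2_1**2 of what is left.
    R2 = r_y1 ** 2 + (r_y2_1 ** 2) * (1 - r_y1 ** 2)
    print(round(R2, 3))  # 0.255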

Page 22: Multiple Correlation (R²) (continued) (14-22)

Page 23: Multiple Correlation (R²) (continued) (14-23)

• R² = 0.255
• In combination, the two independent variables explain a total of 25.5% of the variation in the dependent variable.
  – Note: since number of children (X1) explains 25% of the variance by itself (r²y1 = 0.25), education (X2) adds just 0.5% to R² (0.25 + 0.005 = 0.255).