
Page 1: Midterm Review Ch 7-8. Requests for Help by Chapter

Midterm Review Ch 7-8

Page 2: Midterm Review Ch 7-8. Requests for Help by Chapter

Requests for Help by Chapter

[Bar chart: number of requests for help (vertical axis, 0 to 100) by chapter (horizontal axis, chapters 1 through 8).]

Page 3: Midterm Review Ch 7-8. Requests for Help by Chapter

Chapter 7

Describe the characteristics of the relationship between two variables.

Discuss the null and research hypotheses for correlation.

Discuss using the r table for determining significance.

Page 4: Midterm Review Ch 7-8. Requests for Help by Chapter

Chapter 8

Discuss the steps in making raw data predictions from raw data values.

Discuss the situations when you cannot use regression.

Discuss the inappropriateness of predicting outside of the sample range.

Discuss the null hypothesis in regression.

Discuss alpha levels and critical values with respect to statistical significance.

Discuss residual error when predicting from regression.

Page 5: Midterm Review Ch 7-8. Requests for Help by Chapter

Describe the characteristics of the relationship between two variables.

Page 6: Midterm Review Ch 7-8. Requests for Help by Chapter

Describe the characteristics of the relationship between two variables

Three dimensions characterize the relationship between two variables: linearity, direction, and strength.

Page 7: Midterm Review Ch 7-8. Requests for Help by Chapter

Linearity

The relationship is either linear or curvilinear.

In a linear relationship, as scores on one variable increase, scores on the other variable either generally increase or generally decrease.

In a curvilinear relationship, as scores on one variable increase, scores on the other variable move first in one direction, then in another direction.

Page 8: Midterm Review Ch 7-8. Requests for Help by Chapter

Direction

The direction of a relationship is either positive or negative.

In a positive relationship, as scores on one variable increase, scores on the other variable increase. So a best fitting line rises from left to right on a graph and therefore has a positive slope.

In a negative relationship, as scores on one variable increase, scores on the other variable decrease. So a best fitting line falls from left to right on a graph and therefore has a negative slope.

Page 9: Midterm Review Ch 7-8. Requests for Help by Chapter

Strength

The strength of a correlation indicates how predictable one variable is from another.

In a strong relationship, tX and tY scores are consistently similar or dissimilar. So, you are able to accurately predict one score from another.

In a weak relationship, tX and tY scores are inconsistent in their similarity or dissimilarity. So, you are only able to somewhat predict one score from another.

In an independent relationship, there is no consistency in the relationship of the tX and tY scores. So, it is impossible to predict one score from another.
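To make the strength idea concrete, here is a minimal sketch (Python with NumPy; the data and function names are hypothetical, not from the slides) that computes r as the average cross-product of the tX and tY scores. Consistently similar t scores give an r near 1, while inconsistent t scores give an r near 0.

```python
import numpy as np

def t_scores(values):
    """Standardize raw scores: subtract the mean, divide by the sample SD."""
    values = np.asarray(values, dtype=float)
    return (values - values.mean()) / values.std(ddof=1)

def correlation(x, y):
    """r as the cross-products of t-scores, summed and divided by n - 1."""
    tx, ty = t_scores(x), t_scores(y)
    return np.sum(tx * ty) / (len(x) - 1)

x = [1, 2, 3, 4, 5]

# Strong relationship: t-scores are consistently similar, r is about +.85.
y_strong = [2, 4, 5, 4, 6]
print(round(correlation(x, y_strong), 3))

# Weak relationship: t-scores are inconsistent, r is only about -.30.
y_weak = [5, 1, 4, 2, 3]
print(round(correlation(x, y_weak), 3))
```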

Page 10: Midterm Review Ch 7-8. Requests for Help by Chapter

Discuss alpha levels and critical values with respect to statistical significance.

Discuss the null and research hypotheses for correlation.

Discuss the null hypothesis in regression.

Discuss using the r table for determining significance.

Page 11: Midterm Review Ch 7-8. Requests for Help by Chapter

Alpha levels and significance

Scientists are a careful bunch. They are very careful not to make a Type 1 error.

A Type 1 error is when you mistakenly say that you have found a relationship in a population, when the one you found in your random sample doesn't exist in the population as a whole.

To be careful, scientists will only say there is a relationship if the probability of a Type 1 error is very low (5 in 100 or lower).

Page 12: Midterm Review Ch 7-8. Requests for Help by Chapter

. . . Alpha levels and significance

These probabilities are called alpha levels.

The typical alpha levels are p ≤ .05 or p ≤ .01.

These represent 5 out of 100 or 1 out of 100.

A sample r that is far enough from 0.000 to occur 5 or fewer times in 100 when rho actually equals zero is called significant.
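As an illustration of comparing a sample r against an alpha level, here is a short sketch using scipy.stats.pearsonr, which returns r together with a two-tailed p value. The data are made up for the example, and this p-value route is an alternative to the r table used later in these slides.

```python
from scipy.stats import pearsonr

# Hypothetical sample of paired scores.
x = [2, 4, 5, 7, 8, 10, 11, 13]
y = [1, 3, 4, 6, 8,  9, 12, 14]

r, p = pearsonr(x, y)          # r and its two-tailed p value
alpha = 0.05                   # typical alpha level

if p <= alpha:
    print(f"r = {r:.3f} is significant at p <= {alpha}: reject H0 (rho = 0).")
else:
    print(f"r = {r:.3f} is not significant: continue to accept H0 (rho = 0).")
```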

Page 13: Midterm Review Ch 7-8. Requests for Help by Chapter

The Null Hypothesis

The null hypothesis (H0) states that a non-zero correlation in a sample between two variables is the result of random sampling fluctuation.

Therefore, there is no underlying relationship in the population as a whole.

In mathematical terms, rho = 0.000.

Page 14: Midterm Review Ch 7-8. Requests for Help by Chapter

The Alternative Hypothesis

The alternative hypothesis is the opposite of the null hypothesis.

It states that there is an underlying relationship in the population as a whole and that it is reflected by a non-zero correlation in your random sample.

Page 15: Midterm Review Ch 7-8. Requests for Help by Chapter

Rejecting the Null Hypothesis

The purpose of research is to reject the null hypothesis.

We reject the null hypothesis when the correlation is significant.

The correlation is significant when the probability that the result is due to random sampling fluctuation is less than the .05 or .01 alpha level.

Page 16: Midterm Review Ch 7-8. Requests for Help by Chapter

Using the r Table to Determine Significance

First, calculate r.

Then, determine the degrees of freedom, nP - 2 (the number of pairs minus 2).

Look in Column 2 of the r table to see if r falls outside the CI.95. If it does, r is significant.

Page 17: Midterm Review Ch 7-8. Requests for Help by Chapter

The r table (abridged):

df      Non-significant (CI.95)    .05     .01
1       -.996 to .996              .997    .9999
2       -.949 to .949              .950    .990
3       -.877 to .877              .878    .959
4       -.810 to .810              .811    .917
5       -.753 to .753              .754    .874
6       -.706 to .706              .707    .834
7       -.665 to .665              .666    .798
8       -.631 to .631              .632    .765
9       -.601 to .601              .602    .735
10      -.575 to .575              .576    .708
11      -.552 to .552              .553    .684
12      -.531 to .531              .532    .661
...     ...                        ...     ...
100     -.194 to .194              .195    .254
200     -.137 to .137              .138    .181
300     -.112 to .112              .113    .148
500     -.087 to .087              .088    .115
1000    -.061 to .061              .062    .081
2000    -.043 to .043              .044    .058
10000   -.019 to .019              .020    .026

Look in the df column for the row that has your degrees of freedom. If r falls inside the non-significant range, it is non-significant; if it reaches the .05 or .01 value, it is significant at that alpha level.
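The lookup logic of Pages 16-17 can be sketched as a small function. The dictionary below copies a few rows of the critical values from the table above; the function name and the example r values are hypothetical.

```python
# A few rows of the r table: df -> (.05 critical value, .01 critical value).
R_TABLE = {
    1: (0.997, 0.9999), 2: (0.950, 0.990), 3: (0.878, 0.959),
    4: (0.811, 0.917),  5: (0.754, 0.874), 6: (0.707, 0.834),
    10: (0.576, 0.708), 12: (0.532, 0.661), 100: (0.195, 0.254),
}

def check_significance(r, n_pairs):
    """Compare |r| with the tabled critical values for df = nP - 2."""
    df = n_pairs - 2
    crit_05, crit_01 = R_TABLE[df]
    if abs(r) >= crit_01:
        return "significant at the .01 alpha level"
    if abs(r) >= crit_05:
        return "significant at the .05 alpha level"
    return "non-significant (falls inside the CI.95 around 0.000)"

print(check_significance(0.62, 12))   # df = 10, exceeds .576 -> significant at .05
print(check_significance(0.40, 12))   # df = 10, inside -.575 to .575 -> non-significant
```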

Page 18: Midterm Review Ch 7-8. Requests for Help by Chapter

Significant Correlation

If r is non-significant, we continue to accept the null hypothesis and say that rho = 0.000 in the population.

If r is significant, we reject the null hypothesis at the .05 or .01 alpha level. We assume that rho is best estimated by r, the correlation found in the random sample.

Page 19: Midterm Review Ch 7-8. Requests for Help by Chapter

Discuss the situations when you cannot use regression.

Discuss the inappropriateness of predicting outside of the sample range.

Page 20: Midterm Review Ch 7-8. Requests for Help by Chapter

Use Regression Carefully

When we have the entire population and compute rho, we know all of the values of X and Y. We know the direction and strength of the relationship between the X and Y variables.

Therefore, we can safely use the regression equation to predict Y from X.

Even when rho is exactly zero, the regression equation is still right. It tells us to predict that everyone will score at the mean of Y.

Page 21: Midterm Review Ch 7-8. Requests for Help by Chapter

Samples and Regression

When we have a random sample from a population, we can only predict when (see the sketch below):

r is significant; otherwise we assume that rho is 0,

the relationship between the variables is linear; otherwise it is inappropriate to use correlation at all,

the X score is within the range of X scores in the sample, because for values outside of that range, you do not know if the linear relationship holds.
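A minimal guard-clause sketch of these three checks, assuming Python; the function name and flags are illustrative only (whether the scatterplot looks linear remains a judgment call, passed in here as a flag).

```python
def can_predict(r_is_significant, relationship_is_linear, x_value, sample_x):
    """Return True only if all three conditions from the slide are met."""
    # 1. r must be significant; otherwise we assume rho = 0.
    if not r_is_significant:
        return False
    # 2. The relationship must be linear (judged from the scatterplot).
    if not relationship_is_linear:
        return False
    # 3. The X score must fall within the range of X scores in the sample.
    return min(sample_x) <= x_value <= max(sample_x)

sample_x = [2, 4, 5, 7, 8, 10]
print(can_predict(True, True, 6, sample_x))    # True: all conditions met
print(can_predict(True, True, 15, sample_x))   # False: outside the sample range
```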

Page 22: Midterm Review Ch 7-8. Requests for Help by Chapter

Discuss the steps in making raw data predictions from raw data values.

Page 23: Midterm Review Ch 7-8. Requests for Help by Chapter

Describe the steps in making predictions from raw data.

Scientists are interested in taking the score for one variable and then predicting the score for another variable.

If you want to predict, you must first ensure that there is a linear relationship between the two variables. Then, you must calculate the correlation coefficient and check that it is significant. You also must check that the score you are predicting from is within the original range of scores.

If these conditions are met, then you can use the regression equation to predict. You first convert the predicting score to a t score. Then you plug the t score and the correlation value into the regression equation. You solve the regression equation for the predicted t score. Finally, you convert the predicted t score into the predicted score.
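Here is a sketch of those steps in Python, assuming that the t score meant here is the standardized score (X minus the mean of X, divided by the SD of X) and that the regression equation in t-score form is predicted tY = r times tX. The data and function name are hypothetical.

```python
import numpy as np

def predict_y_from_x(x_new, x_sample, y_sample, r):
    """Predict a raw Y score from a raw X score via t scores: t_Y' = r * t_X."""
    x_mean, x_sd = np.mean(x_sample), np.std(x_sample, ddof=1)
    y_mean, y_sd = np.mean(y_sample), np.std(y_sample, ddof=1)

    t_x = (x_new - x_mean) / x_sd      # 1. convert the predicting score to a t score
    t_y_pred = r * t_x                 # 2. plug t_X and r into the regression equation
    return y_mean + t_y_pred * y_sd    # 3. convert the predicted t score back to a raw Y

x = [2, 4, 5, 7, 8, 10]
y = [10, 14, 15, 19, 20, 24]
r = np.corrcoef(x, y)[0, 1]            # assumes r has already been checked for significance
print(round(predict_y_from_x(6.0, x, y, r), 2))
```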

Page 24: Midterm Review Ch 7-8. Requests for Help by Chapter

Discuss residual error when predicting from regression.

Page 25: Midterm Review Ch 7-8. Requests for Help by Chapter

Discuss residual error when predicting from regression.

The average squared error when we predict from the mean is the variance, also called the mean square error.

The average squared error when we predict from the regression equation is called the residual mean square.

Page 26: Midterm Review Ch 7-8. Requests for Help by Chapter

Residual Square Error

A significant correlation will always yield a better prediction than the mean.

Therefore, the residual mean square is always better (that is, smaller) than the variance.

Page 27: Midterm Review Ch 7-8. Requests for Help by Chapter

Steps in calculating Residual Square Error

To calculate the variance, take the deviations of Y from the mean of Y, square them, add them up, and divide by the degrees of freedom.

To calculate the residual mean square, take the deviations of each Y from its predicted value, square them, add them up, and divide by the degrees of freedom.
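A sketch of this long way in Python, assuming the variance uses df = nP - 1 and the residual mean square uses df = nP - 2 (the slides just say "degrees of freedom"). Predictions come from the t-score form of the regression equation, and the data are hypothetical.

```python
import numpy as np

def residual_error_long_way(x, y):
    """Average squared error from the mean vs. from the regression prediction."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    n = len(y)

    # Variance: deviations of Y from the mean of Y, squared, summed, divided by df.
    ss_y = np.sum((y - y.mean()) ** 2)
    variance = ss_y / (n - 1)

    # Residual mean square: deviations of each Y from its predicted value,
    # squared, summed, divided by df = nP - 2.
    r = np.corrcoef(x, y)[0, 1]
    t_x = (x - x.mean()) / x.std(ddof=1)
    y_pred = y.mean() + r * t_x * y.std(ddof=1)   # t-score regression prediction
    ss_resid = np.sum((y - y_pred) ** 2)
    ms_resid = ss_resid / (n - 2)

    return variance, ms_resid

x = [2, 4, 5, 7, 8, 10]
y = [10, 13, 16, 18, 19, 25]
print(residual_error_long_way(x, y))   # the residual mean square is the smaller one
```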

Page 28: Midterm Review Ch 7-8. Requests for Help by Chapter

Short way to do that

r² equals the proportion of error that is gotten rid of when you use the regression equation rather than the mean as your prediction.

So the amount of error you get rid of equals the original sum of squares for Y (SSY) times r².

So the remaining error, SSRESID, equals the amount of error you get by using the mean as your predictor (SSY) minus the amount you get rid of by using the regression equation (r²SSY).

Page 29: Midterm Review Ch 7-8. Requests for Help by Chapter

SSRESID = SSY - r²SSY

To get the average squared error when using the regression equation, divide SSRESID by the degrees of freedom, nP - 2:

MSRESID = SSRESID / (nP - 2)

The standard error of the estimate is simply the square root of MSRESID.
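And a sketch of this short way (Python, same hypothetical data as the long-way sketch), using the formulas above: SSRESID = SSY - r²SSY, MSRESID = SSRESID / (nP - 2), and the standard error of the estimate as the square root of MSRESID. It reproduces the residual mean square obtained from the deviations.

```python
import numpy as np

def residual_error_short_way(x, y):
    """SSRESID = SSY - r^2 * SSY; MSRESID = SSRESID / (nP - 2); SE est = sqrt(MSRESID)."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    n_pairs = len(y)

    ss_y = np.sum((y - y.mean()) ** 2)      # error from predicting with the mean of Y
    r = np.corrcoef(x, y)[0, 1]
    ss_resid = ss_y - r**2 * ss_y           # error that remains after using regression
    ms_resid = ss_resid / (n_pairs - 2)     # average squared error for regression
    return ms_resid, np.sqrt(ms_resid)      # residual mean square, standard error of estimate

x = [2, 4, 5, 7, 8, 10]
y = [10, 13, 16, 18, 19, 25]
print(residual_error_short_way(x, y))
```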