In: Statistics and Probability
Suppose the correlation between first year GPA and second year GPA is 0.7. Assuming a close to linear relationship between GPAs in first year and in second year, what is the approximate average z-score in second year of students who had a z-score of 1.5 in their first-year GPA?
What is the average first-year z-score of students who have a z-score of 1.5 in their second-year GPA?
It seems that, on average, students who do well in first year do
better than average in second year but not quite as well as they
did in first year. And students who do better than average in
second year did, on average, better than average in first year but
not quite as well as they did in second year. Whichever way you go,
from first year to second year, or from second year to first year,
it looks like the grades are getting closer to the average.
Is this a contradiction? Is there an explanation for it? If you
need a diagram to help explain it, go ahead and draw one.
Problem 1:
Let x = first-year GPA and y = second-year GPA.
The correlation between x and y is r = 0.7, which indicates a positive, moderately strong linear relationship. When both variables are expressed as z-scores, the regression line of y on x passes through the origin with slope r, so the predicted second-year z-score of students with a first-year z-score of 1.5 is

    predicted z_y = r * z_x = 0.7 * 1.5 = 1.05.

By the same reasoning in the other direction, students with a second-year z-score of 1.5 had, on average, a first-year z-score of 0.7 * 1.5 = 1.05. Note also that r^2 = 0.7^2 = 0.49, so the regression of y on x (or of x on y) explains only 49% of the total variation; 51% remains unexplained. There is no contradiction here: this is regression to the mean. Because |r| < 1, the predicted z-score in either direction is always closer to the average than the z-score we condition on, so students who do well in first year do better than average in second year (and vice versa) but, on average, not quite as well as in the year we conditioned on.
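As a sanity check, a minimal simulation sketch: model the standardized GPAs as bivariate normal with correlation 0.7 (the band width of 0.05 around 1.5 and the sample size are illustrative assumptions, not part of the problem). Both conditional averages come out near 1.05:

```python
import numpy as np

rng = np.random.default_rng(0)
r = 0.7
n = 1_000_000

# Standardized (z-score) GPAs with correlation r:
# setting y = r*x + sqrt(1 - r^2)*noise gives Corr(x, y) = r.
x = rng.standard_normal(n)
y = r * x + np.sqrt(1 - r**2) * rng.standard_normal(n)

# Students whose first-year z-score is near 1.5
band_x = np.abs(x - 1.5) < 0.05
print(y[band_x].mean())   # ≈ 0.7 * 1.5 = 1.05

# Symmetrically: students whose second-year z-score is near 1.5
band_y = np.abs(y - 1.5) < 0.05
print(x[band_y].mean())   # ≈ 1.05 as well
```

The symmetry of the two answers is what the simulation makes visible: conditioning on either year and averaging the other always pulls toward the mean, in both directions.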