1. How do we recognize a correctly stated standardized regression equation?
2. After scores have been standardized, the value of the Y intercept will always be what?
3. What does the coefficient of multiple determination show?
4. Under what condition could the coefficient of multiple determination be lower than the zero-order correlation coefficients?
5. What is the coefficient of multiple determination with two independent variables?
(1)
Standardized regression coefficients remove the units of measurement of the predictor and outcome variables. They are sometimes called betas, but I avoid that term because too many other, related concepts also go by the name beta.
There are many good reasons to report them. But some procedures won't compute standardized coefficients for you.
Often it makes more sense to use a general linear model procedure to run regressions, yet the GLM procedures in SAS and SPSS don't report standardized coefficients.
Likewise, you won’t get standardized regression coefficients reported after combining results from multiple imputation.
Luckily, there’s a way to get around it.
A standardized coefficient is the same as the unstandardized coefficient you would get from a regression of the standardized outcome on the standardized predictors. We usually learn to standardize the coefficient itself (multiplying it by the ratio of the standard deviations) because that's the shortcut, but it's this equivalence to the coefficient between standardized variables that gives a standardized coefficient its meaning.
So all you have to do to get standardized coefficients is standardize the variables before fitting the model.
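A quick sketch of this equivalence, using simulated (hypothetical) data and plain NumPy: the shortcut formula b·(sd_x / sd_y) and the slope from a regression on z-scored variables give the same number.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated data (hypothetical): one predictor, one outcome
x = rng.normal(10, 2, 200)
y = 3.0 * x + rng.normal(0, 5, 200)

# Unstandardized slope, then the textbook shortcut: b * (sd_x / sd_y)
b1 = np.polyfit(x, y, 1)[0]
beta_shortcut = b1 * x.std(ddof=1) / y.std(ddof=1)

# Standardize both variables (z-scores), then fit the same regression
zx = (x - x.mean()) / x.std(ddof=1)
zy = (y - y.mean()) / y.std(ddof=1)
beta_direct = np.polyfit(zx, zy, 1)[0]

print(beta_shortcut, beta_direct)  # the two values agree
```

The same trick works in any procedure that fits unstandardized models (SAS/SPSS GLM, multiple imputation pooling): standardize first, then read the ordinary coefficients as standardized ones.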
(2)
Coefficient of Multiple Determination
The coefficient of multiple determination (R²) measures the proportion of variation in the dependent variable that can be predicted from the set of independent variables in a multiple regression equation. When the regression equation fits the data well, R² is large; when it fits poorly, R² is small.
The coefficient of multiple determination can be defined in terms of sums of squares:
SSR = Σ ( ŷ − ȳ )²
SSTO = Σ ( y − ȳ )²
R² = SSR / SSTO
where SSR is the sum of squares due to regression, SSTO is the total sum of squares, ŷ is the predicted value of the dependent variable, ȳ is the mean of the dependent variable, and y is a raw score on the dependent variable.
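The definitions above can be computed directly. A minimal sketch with simulated (hypothetical) two-predictor data, fitting by ordinary least squares and then forming SSR, SSTO, and their ratio:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical data with two independent variables
n = 100
X1 = rng.normal(size=n)
X2 = rng.normal(size=n)
y = 2 + 1.5 * X1 - 0.7 * X2 + rng.normal(size=n)

# Fit y = b0 + b1*X1 + b2*X2 by ordinary least squares
X = np.column_stack([np.ones(n), X1, X2])
b, *_ = np.linalg.lstsq(X, y, rcond=None)
y_hat = X @ b  # predicted values (ŷ)

SSR = np.sum((y_hat - y.mean()) ** 2)   # sum of squares due to regression
SSTO = np.sum((y - y.mean()) ** 2)      # total sum of squares
R2 = SSR / SSTO                         # proportion of variation explained
print(R2)
```

Because SSR can never exceed SSTO for a least-squares fit with an intercept, R² always falls between 0 and 1.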
(5)
Multiple regression is an extension of simple linear regression in which more than one independent variable (X) is used to predict a single dependent variable (Y). The predicted value of Y is a linear transformation of the X variables such that the sum of squared deviations of the observed and predicted Y is a minimum. The computations are more complex, however, because the interrelationships among all the variables must be taken into account in the weights assigned to the variables. The interpretation of the results of a multiple regression analysis is also more complex for the same reason.
With two independent variables the prediction of Y is expressed by the following equation:
Y′ᵢ = b₀ + b₁X₁ᵢ + b₂X₂ᵢ
Note that this transformation is similar to the linear transformation of two variables discussed in the previous chapter, except that the w's have been replaced with b's and X′ᵢ has been replaced with Y′ᵢ.
The "b" values are called regression weights and are computed in a way that minimizes the sum of squared deviations, in the same manner as in simple linear regression. The difference is that in simple linear regression only two weights, the intercept (b₀) and the slope (b₁), were estimated, while in this case three weights (b₀, b₁, and b₂) are estimated.
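The point about interrelationships among the variables can be seen numerically. In this sketch (simulated data, hypothetical coefficients), X1 and X2 are correlated: the slope from a simple regression of Y on X1 alone absorbs part of X2's effect, while the multiple regression weight b₁ takes X2 into account and recovers the true value:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 500

# Correlated predictors (hypothetical data); true model: y = 1 + 2*X1 + 3*X2 + error
X1 = rng.normal(size=n)
X2 = 0.8 * X1 + rng.normal(scale=0.6, size=n)
y = 1.0 + 2.0 * X1 + 3.0 * X2 + rng.normal(size=n)

# Simple regression: slope of y on X1 alone
b1_simple = np.polyfit(X1, y, 1)[0]

# Multiple regression: b1 estimated with X2 also in the model
X = np.column_stack([np.ones(n), X1, X2])
b0, b1, b2 = np.linalg.lstsq(X, y, rcond=None)[0]

# b1 stays near the true weight of 2.0, while the simple slope is
# inflated because X2's contribution rides along with X1.
print(b1_simple, b1)
```

This is why the weights (and their interpretation) are more complex in multiple regression: each bᵢ is the effect of its variable with the other predictors held constant.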