(i)
The standard error of the OLS estimate β̂_j is given as

se(β̂_j) = σ̂ / sqrt( SST_j (1 − R_j²) ),

where
σ̂ is the standard error of the regression,
SST_j = Σ_i (x_ij − x̄_j)² is the total sample variation in x_j, and
R_j² is the R-squared from regressing x_j on all other independent variables (including an intercept).
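As a sketch of this formula (using NumPy and a simulated data set; the data-generating process and variable names are invented for illustration), the component-wise expression se(β̂_j) = σ̂ / sqrt(SST_j(1 − R_j²)) can be checked against the usual matrix-based standard error from σ̂²(X'X)⁻¹:

```python
import numpy as np

rng = np.random.default_rng(0)
n, k = 200, 2
x1 = rng.normal(size=n)
x2 = 0.5 * x1 + rng.normal(size=n)            # correlated regressors
y = 1.0 + 2.0 * x1 - 1.0 * x2 + rng.normal(size=n)

X = np.column_stack([np.ones(n), x1, x2])
beta_hat = np.linalg.lstsq(X, y, rcond=None)[0]
resid = y - X @ beta_hat
sigma_hat2 = resid @ resid / (n - k - 1)      # squared SER

# Matrix-based standard error of beta_hat_1: sqrt of the (1,1) entry
# of sigma_hat^2 * (X'X)^-1
se_matrix = np.sqrt(sigma_hat2 * np.linalg.inv(X.T @ X)[1, 1])

# Formula-based version: R_1^2 comes from regressing x1 on the other
# regressor (with an intercept), SST_1 is the total variation in x1.
Z = np.column_stack([np.ones(n), x2])
r1 = x1 - Z @ np.linalg.lstsq(Z, x1, rcond=None)[0]
SST_1 = np.sum((x1 - x1.mean()) ** 2)
R1_sq = 1 - (r1 @ r1) / SST_1
se_formula = np.sqrt(sigma_hat2) / np.sqrt(SST_1 * (1 - R1_sq))

assert np.isclose(se_matrix, se_formula)      # the two expressions agree
```

The agreement is exact (up to floating-point error) because, by the Frisch-Waugh-Lovell logic, SST_j(1 − R_j²) equals the residual sum of squares from the auxiliary regression, which is the reciprocal of the (j, j) entry of (X'X)⁻¹.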
(ii)
R_j² is the R-squared from regressing x_j on all other independent variables (including an intercept). Because R-squared measures goodness of fit, a value of R_j² close to one indicates that the other independent variables explain much of the sample variation in x_j. This means that x_j is highly correlated with the other independent variables.
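A small simulated illustration of this point (NumPy; the data and helper name are invented for the demo): the more strongly a regressor depends on the others, the closer its auxiliary R_j² gets to one, and the variance of β̂_j is inflated by the factor 1/(1 − R_j²), often called the variance inflation factor (VIF):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 500
x2 = rng.normal(size=n)

def rj_squared(x1, x2):
    """R-squared from regressing x1 on x2 (with an intercept)."""
    Z = np.column_stack([np.ones(len(x2)), x2])
    resid = x1 - Z @ np.linalg.lstsq(Z, x1, rcond=None)[0]
    return 1 - (resid @ resid) / np.sum((x1 - x1.mean()) ** 2)

# A regressor weakly vs. strongly related to x2: R_j^2 approaches 1
# as the dependence grows, blowing up the VIF = 1 / (1 - R_j^2).
weak = 0.1 * x2 + rng.normal(size=n)
strong = 5.0 * x2 + 0.1 * rng.normal(size=n)
assert rj_squared(weak, x2) < rj_squared(strong, x2)
```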
The standard error of the regression is σ̂ = sqrt(σ̂²), where

σ̂² = SSR / (n − k − 1) = (1 / (n − k − 1)) Σ_i û_i²,

n is the number of observations, k is the number of independent variables, and the û_i are the OLS residuals.
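The computation of σ̂² from the residuals can be sketched as follows (NumPy, with an invented data-generating process); note that with an intercept in the model the OLS residuals sum to zero, which is why only n − k − 1 degrees of freedom remain:

```python
import numpy as np

rng = np.random.default_rng(2)
n, k = 100, 3
X = np.column_stack([np.ones(n), rng.normal(size=(n, k))])
beta = np.array([1.0, 0.5, -2.0, 0.0])        # true parameters (assumed)
y = X @ beta + rng.normal(size=n)             # error variance is 1 here

beta_hat = np.linalg.lstsq(X, y, rcond=None)[0]
u_hat = y - X @ beta_hat                      # OLS residuals
sigma_hat2 = (u_hat @ u_hat) / (n - k - 1)    # unbiased estimator of Var(u)
ser = np.sqrt(sigma_hat2)                     # standard error of the regression
```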
(iii)
Gauss-Markov Theorem - Under Assumptions MLR.1 through MLR.5 below, the OLS estimators β̂_0, β̂_1, ..., β̂_k are the best linear unbiased estimators (BLUEs) of β_0, β_1, ..., β_k, respectively.
Assumptions -
Assumption MLR.1 (Linear in Parameters) - The model in the population can be written as

y = β_0 + β_1 x_1 + β_2 x_2 + ... + β_k x_k + u,

where β_0, β_1, ..., β_k are the unknown parameters (constants) of interest and u is an unobservable random error or disturbance term.
Assumption MLR.2 (Random Sampling) - We have a random sample of n observations, {(x_i1, x_i2, ..., x_ik, y_i) : i = 1, 2, ..., n}, following the population model in Assumption MLR.1.
Assumption MLR.3 (No Perfect Collinearity) - In the sample (and therefore in the population), none of the independent variables is constant, and there are no exact linear relationships among the independent variables.
Assumption MLR.4 (Zero Conditional Mean) - The error u has an expected value of zero given any values of the independent variables. In other words, E(u | x_1, x_2, ..., x_k) = 0.
Assumption MLR.5 (Homoskedasticity) - The error u has the same variance given any values of the explanatory variables. In other words, Var(u | x_1, x_2, ..., x_k) = σ².
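The unbiasedness part of the theorem can be illustrated with a small Monte Carlo sketch (NumPy; the sample size, number of replications, and true parameter values are all chosen arbitrarily for the demo). Each replication draws a fresh sample satisfying MLR.1-MLR.5, and the average of the OLS estimates across replications should land close to the true parameters:

```python
import numpy as np

rng = np.random.default_rng(3)
n, reps = 50, 2000
beta = np.array([1.0, 2.0, -0.5])             # true parameters (assumed)

estimates = np.empty((reps, 3))
for r in range(reps):
    # Random sample: two non-collinear regressors plus an intercept
    X = np.column_stack([np.ones(n), rng.normal(size=(n, 2))])
    u = rng.normal(size=n)                    # zero-mean, homoskedastic error
    y = X @ beta + u
    estimates[r] = np.linalg.lstsq(X, y, rcond=None)[0]

# Under MLR.1-MLR.5 the OLS estimators are unbiased, so the
# average estimate minus the truth should be near zero.
bias = estimates.mean(axis=0) - beta
```

This only demonstrates unbiasedness; the "best" (minimum-variance) part of the theorem additionally uses MLR.5 and is not checked here.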