Consequences of multicollinearity
- Even extreme multicollinearity (so long as it is not perfect)
does not violate OLS assumptions. OLS estimates are still unbiased
and BLUE (Best Linear Unbiased Estimators).
- Nevertheless, the greater the multicollinearity, the larger the
standard errors of the affected coefficients. Note, however, that
large standard errors can be caused by things besides
multicollinearity.
- When two IVs are highly and positively correlated, their slope
coefficient estimators will tend to be highly and negatively
correlated. In other words, if you overestimate the effect of one
parameter, you will tend to underestimate the effect of the other.
Hence, coefficient estimates tend to be very unstable from one
sample to the next (see the simulation sketch after this list).
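The following is a minimal simulation sketch of these points in Python, assuming numpy and statsmodels are installed; the chosen correlation (0.95), sample size, and true coefficients are illustrative, not from the notes. It shows that the slope estimates remain unbiased, but vary widely and are strongly negatively correlated across repeated samples.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n, reps, rho = 200, 500, 0.95  # illustrative settings

b1_hats, b2_hats = [], []
for _ in range(reps):
    # Two regressors with correlation rho; true model y = 1 + 2*x1 + 3*x2 + e
    x1 = rng.normal(size=n)
    x2 = rho * x1 + np.sqrt(1 - rho**2) * rng.normal(size=n)
    y = 1 + 2 * x1 + 3 * x2 + rng.normal(size=n)
    X = sm.add_constant(np.column_stack([x1, x2]))
    fit = sm.OLS(y, X).fit()
    b1_hats.append(fit.params[1])
    b2_hats.append(fit.params[2])

# Estimates are centred on the true values (unbiasedness is preserved) ...
print("mean b1_hat:", np.mean(b1_hats), "mean b2_hat:", np.mean(b2_hats))
# ... but their sampling variability is large ...
print("sd b1_hat:", np.std(b1_hats), "sd b2_hat:", np.std(b2_hats))
# ... and the two slope estimates are strongly negatively correlated.
print("corr(b1_hat, b2_hat):", np.corrcoef(b1_hats, b2_hats)[0, 1])
```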
Consequences of heteroscedasticity
- The OLS estimators, and the regression predictions based on them,
remain unbiased and consistent.
- The OLS estimators are no longer BLUE (Best Linear Unbiased
Estimators) because they are no longer efficient, so the regression
predictions will be inefficient too.
- Because the usual estimator of the covariance matrix of the
regression coefficients is biased and inconsistent, the standard
tests of hypotheses (t-tests, F-tests) are no longer valid (see the
sketch after this list).
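A minimal Python sketch of this last point, assuming numpy and statsmodels; the data-generating process, with error variance growing in x, is an illustrative choice. The OLS coefficient estimates are unchanged, but the usual standard errors differ noticeably from heteroscedasticity-robust (White) standard errors, which is exactly why the usual t-tests and F-tests can mislead.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 500
x = rng.uniform(1, 10, size=n)
# Error standard deviation grows with x, violating homoscedasticity.
y = 2 + 0.5 * x + rng.normal(scale=x, size=n)

X = sm.add_constant(x)
ols = sm.OLS(y, X).fit()                   # usual (non-robust) covariance
robust = sm.OLS(y, X).fit(cov_type="HC1")  # White/HC1 robust covariance

# Same coefficient estimates (OLS is still unbiased and consistent) ...
print("slope estimate:", ols.params[1], robust.params[1])
# ... but the standard errors, and hence the t-tests, differ.
print("usual  SE of slope:", ols.bse[1])
print("robust SE of slope:", robust.bse[1])
```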