Question

In: Economics

Under what conditions does the Gauss-Markov Theorem guarantee the OLS estimators to be BLUE? State such conditions and explain each of them in your words. What does it mean for the OLS estimators to be BLUE? Explain

Solutions

Expert Solution

According to the Gauss-Markov theorem, under the conditions satisfied by the Classical Linear Regression Model (CLRM), the OLS (Ordinary Least Squares) estimators have the minimum variance within the class of linear unbiased estimators. In other words, the OLS estimators are BLUE, i.e. Best Linear Unbiased Estimators.

The conditions of the respective theorem can be stated below using a two variable regression model:

Condition 1: The regression model must be linear in the parameters. It is not necessary that the model be linear in the dependent and independent variables themselves. An example of such a regression model is:

Yi = B1 + B2Xi + ui       

{Yi : Dependent Variable; Xi : Independent Variable; B1 and B2 : Parameters; ui : Error Term}
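As an illustration (not part of the original solution), here is a minimal Python sketch that simulates the two-variable model above with assumed parameter values B1 = 2 and B2 = 0.5, then recovers them with the closed-form OLS formulas:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate Y = B1 + B2*X + u with assumed true values B1 = 2, B2 = 0.5
n = 500
B1, B2 = 2.0, 0.5
X = rng.uniform(0, 10, n)
u = rng.normal(0, 1, n)          # mean-zero, homoscedastic errors
Y = B1 + B2 * X + u

# Closed-form OLS estimates: slope = Cov(X, Y) / Var(X), intercept from the means
b2 = np.cov(X, Y, bias=True)[0, 1] / np.var(X)
b1 = Y.mean() - b2 * X.mean()
print(b1, b2)
```

With a sample of this size, b1 and b2 land close to the assumed true values 2 and 0.5.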

Condition 2: The independent/explanatory variable X is uncorrelated with the error term ui. If the values of Xi are fixed (non-stochastic), this assumption holds automatically. Even if Xi is stochastic, the assumption holds at least approximately provided the sample size is large enough.

Condition 3: The mean, or expected value, of the error term is zero. The error term collects all the factors that are not explicitly introduced into the regression model. Since these factors are unrelated to Xi (by Condition 2), the conditional mean of ui given Xi is zero:

E(ui | Xi) = 0

Condition 4: The error term is homoscedastic, which means the variance of each ui is constant. Equivalently, the individual values of the dependent variable Y are spread around their mean values with constant variance. If this does not hold, the regression model suffers from heteroscedasticity, or unequal variance. Homoscedasticity can be written algebraically as:

Var(ui) = σ²
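A short simulation can make Condition 4 concrete. The error specifications below are illustrative assumptions: a constant standard deviation of 1 for the homoscedastic case, and a standard deviation of 0.2·X (growing with X) for the heteroscedastic one:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 2000
X = rng.uniform(0, 10, n)

# Homoscedastic errors: constant variance regardless of X
u_homo = rng.normal(0, 1, n)

# Heteroscedastic errors: the spread grows with X, violating Var(ui) = sigma^2
u_hetero = rng.normal(0, 0.2 * X)

# Compare the error variance for small vs. large X in each case
low, high = X < 5, X >= 5
print(u_homo[low].var(), u_homo[high].var())      # roughly equal
print(u_hetero[low].var(), u_hetero[high].var())  # clearly unequal
```

The homoscedastic errors show about the same variance in both halves of the X range, while the heteroscedastic errors are far more dispersed at large X.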

Condition 5: Another important condition of the CLRM is no autocorrelation, which means there is no systematic relation between any two error terms ui and uj. Algebraically the condition can be written as:

Cov(ui , uj) = 0,  for i ≠ j

This states that there is neither a positive nor a negative correlation between any two error terms, and thus the error terms are random.
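The contrast between independent and autocorrelated errors can be sketched as below. The AR(1) coefficient of 0.8 is an illustrative assumption, chosen to make the systematic link between consecutive errors obvious:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 5000

# Independent errors: Cov(ui, uj) = 0, so adjacent errors are uncorrelated
u_iid = rng.normal(0, 1, n)

# AR(1) errors: u_i = 0.8 * u_{i-1} + e_i, a systematic link between errors
e = rng.normal(0, 1, n)
u_ar = np.zeros(n)
for i in range(1, n):
    u_ar[i] = 0.8 * u_ar[i - 1] + e[i]

# Sample correlation between each error and the next one
r_iid = np.corrcoef(u_iid[:-1], u_iid[1:])[0, 1]
r_ar = np.corrcoef(u_ar[:-1], u_ar[1:])[0, 1]
print(r_iid, r_ar)
```

The first correlation is close to zero, while the second is strongly positive, which is exactly the kind of systematic relation Condition 5 rules out.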

Condition 6 : The regression model is free from any specification bias or specification error; that is, the model is correctly specified. This condition ensures that all the variables that affect the phenomenon under study are included in the model.

Condition 7 : An added assumption in the case of a multiple regression model is that the explanatory variables are not exactly collinear, i.e. there is no perfect multicollinearity among the explanatory variables specified in the multiple regression model.

Suppose that, for the population regression function above, the sample regression function estimated by the ordinary least squares method is:

Sample Regression Function: Yi = b1 + b2Xi + ei

The OLS estimators b1 and b2 are BLUE, i.e. Best Linear Unbiased Estimators, which means they satisfy the following properties:

1. Linearity: b1 and b2 are linear estimators, which means they are linear functions of the random dependent variable Y. The OLS estimators are linear in the dependent variable, not necessarily in the independent variables.

2. Unbiasedness: The OLS estimators are unbiased estimators of their true population parameters. An estimator is unbiased if its expected value equals the true value of the parameter, so that it is correct on average across repeated samples:

E(b1) = B1 and E(b2) = B2
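Unbiasedness is a statement about repeated sampling, which a small Monte Carlo experiment can illustrate (the true values B1 = 2 and B2 = 0.5 are assumed for the simulation):

```python
import numpy as np

rng = np.random.default_rng(3)
B1, B2, n = 2.0, 0.5, 50

# Re-estimate the slope on many independent samples; unbiasedness means
# the estimates average out to the assumed true parameter B2.
estimates = []
for _ in range(2000):
    X = rng.uniform(0, 10, n)
    u = rng.normal(0, 1, n)
    Y = B1 + B2 * X + u
    b2 = np.cov(X, Y, bias=True)[0, 1] / np.var(X)
    estimates.append(b2)

print(np.mean(estimates))
```

Any single estimate misses the true slope, but the average over the 2000 samples sits very close to 0.5, which is what E(b2) = B2 asserts.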

3. Efficiency: Suppose there exist two unbiased estimators T1 and T2; if Var(T1) is less than Var(T2), then T1 is more efficient than T2. The OLS estimators b1 and b2 are efficient estimators, which means they are the best among the class of all linear unbiased estimators of B1 and B2, i.e. they have the minimum variance.

4. Consistency: An estimator is said to be consistent if it is asymptotically unbiased and its variance converges to zero as the sample size increases. Both of these conditions hold for the OLS estimators.
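The shrinking-variance part of consistency can be checked numerically. The sketch below (with the same assumed parameter values as before) measures the sampling variance of the OLS slope at two sample sizes:

```python
import numpy as np

rng = np.random.default_rng(4)
B1, B2 = 2.0, 0.5

def slope_variance(n, reps=1000):
    """Sampling variance of the OLS slope b2 at sample size n."""
    ests = []
    for _ in range(reps):
        X = rng.uniform(0, 10, n)
        Y = B1 + B2 * X + rng.normal(0, 1, n)
        ests.append(np.cov(X, Y, bias=True)[0, 1] / np.var(X))
    return np.var(ests)

v_small, v_large = slope_variance(25), slope_variance(400)
print(v_small, v_large)
```

The variance at n = 400 is a small fraction of the variance at n = 25, consistent with the estimator's variance converging to zero as the sample grows.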

