Consider a simple linear regression model with time series data:

$y_t = \beta_0 + \beta_1 x_t + u_t$, $t = 1, 2, \dots, T$.

Suppose the error $u_t$ is strictly exogenous; that is, $E(u_t \mid x_1, \dots, x_t, \dots, x_T) = 0$.

Moreover, the error term follows an AR(1) serial correlation model:

$u_t = \rho u_{t-1} + e_t$, $t = 1, 2, \dots, T$, (3)

where the $e_t$ are uncorrelated with zero mean and constant variance.
a. [2 points] Will the OLS estimator of $\beta_1$ be unbiased? Why or why not?
b. [3 points] Will the conventional estimator of the variance of the OLS estimator be unbiased? Why or why not?
c. [5 points] Explain in detail how you will test for serial correlation in $u_t$ using a t-test. [Hint: your null hypothesis is that $\rho = 0$ in equation (3).]
Solution:
Given: a simple linear regression model with time series data,

$y_t = \beta_0 + \beta_1 x_t + u_t$, $t = 1, 2, \dots, T$,

where the error $u_t$ is strictly exogenous, $E(u_t \mid x_1, \dots, x_t, \dots, x_T) = 0$, and follows the AR(1) serial correlation model

$u_t = \rho u_{t-1} + e_t$, $t = 1, 2, \dots, T$. (3)
a. Yes, the OLS estimator of $\beta_1$ is unbiased. Unbiasedness is a finite-sample property, expressed as $E(\hat\beta_1) = \beta_1$, where the expectation is taken over the finite-sample distribution of $\hat\beta_1$. For time series OLS it requires only linearity in parameters, sample variation in $x_t$, and strict exogeneity, $E(u_t \mid x_1, \dots, x_T) = 0$, all of which hold here by assumption. Serial correlation in $u_t$ places no restriction on this conditional mean, so it does not affect unbiasedness.
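As a quick check of the claim in part (a), here is a small Monte Carlo sketch (my own illustration, not part of the assignment; the parameter values $\beta_1 = 2$, $\rho = 0.5$, $T = 100$ are arbitrary assumptions): with a strictly exogenous regressor and AR(1) errors, the Monte Carlo average of the OLS slope stays close to the true $\beta_1$.

```python
import numpy as np

# Monte Carlo check: strictly exogenous AR(1) errors leave OLS unbiased.
# All parameter values below are arbitrary choices for illustration.
rng = np.random.default_rng(0)
T, R = 100, 2000                  # sample size, number of replications
beta0, beta1, rho = 1.0, 2.0, 0.5

def ols_slope(x, y):
    """OLS slope from a simple regression of y on x (with intercept)."""
    xd = x - x.mean()
    return xd @ (y - y.mean()) / (xd @ xd)

estimates = np.empty(R)
for r in range(R):
    x = rng.normal(size=T)        # regressor drawn independently of all errors
    e = rng.normal(size=T)        # i.i.d. innovations e_t
    u = np.empty(T)
    u[0] = e[0]
    for t in range(1, T):         # AR(1) errors: u_t = rho * u_{t-1} + e_t
        u[t] = rho * u[t - 1] + e[t]
    y = beta0 + beta1 * x + u
    estimates[r] = ols_slope(x, y)

print(estimates.mean())           # close to beta1 = 2.0 despite serial correlation
```

The average of the 2000 slope estimates is essentially the true value, consistent with unbiasedness under strict exogeneity.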
b. No. The conventional variance estimator, $\widehat{\mathrm{Var}}(\hat\beta_1) = \hat\sigma^2 / \mathrm{SST}_x$ with $\mathrm{SST}_x = \sum_t (x_t - \bar x)^2$, is derived under the assumption that the errors are serially uncorrelated. With AR(1) errors and $\rho \neq 0$, the true variance of $\hat\beta_1$ contains covariance terms $\mathrm{Cov}(u_t, u_s) = \rho^{|t-s|}\sigma_u^2 \neq 0$ that the conventional formula ignores, so the conventional estimator is biased. In particular, with $\rho > 0$ and a positively autocorrelated regressor, it typically understates the true variance, making the usual standard errors, t statistics, and confidence intervals invalid.
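A Monte Carlo sketch of the variance problem in part (b) (illustrative only; making both $x_t$ and $u_t$ AR(1) with $\rho = 0.5$ is my own design choice, since the understatement shows up when the regressor is also autocorrelated): the empirical standard deviation of the slope estimates exceeds the average conventional standard error.

```python
import numpy as np

# Monte Carlo comparison of the conventional OLS standard error with the
# true sampling variability of the slope when both x_t and u_t are AR(1).
# Parameter values are arbitrary illustrative choices.
rng = np.random.default_rng(1)
T, R = 100, 2000
beta0, beta1, rho = 1.0, 2.0, 0.5

def ar1(rng, T, rho):
    """Simulate T observations of an AR(1) process with i.i.d. normal innovations."""
    e = rng.normal(size=T)
    out = np.empty(T)
    out[0] = e[0]
    for t in range(1, T):
        out[t] = rho * out[t - 1] + e[t]
    return out

slopes = np.empty(R)
conv_se = np.empty(R)
for r in range(R):
    x = ar1(rng, T, rho)          # autocorrelated regressor, independent of u
    u = ar1(rng, T, rho)          # AR(1) errors
    y = beta0 + beta1 * x + u
    xd = x - x.mean()
    sst_x = xd @ xd
    b1 = xd @ (y - y.mean()) / sst_x
    b0 = y.mean() - b1 * x.mean()
    resid = y - b0 - b1 * x
    sigma2_hat = resid @ resid / (T - 2)       # conventional sigma^2 estimate
    slopes[r] = b1
    conv_se[r] = np.sqrt(sigma2_hat / sst_x)   # conventional standard error

# The empirical sd of the slope exceeds the average conventional SE here.
print(slopes.std(), conv_se.mean())
```

The slope remains unbiased (part a), but the conventional standard error is too small on average, which is exactly the invalid-inference problem described above.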
c. If $\rho = 0$, then $u_t = e_t$, and the errors satisfy the no-serial-correlation assumption. Hence a test for serial correlation in $u_t$ is a test of $H_0: \rho = 0$ against $H_1: \rho \neq 0$ in equation (3). Since the $u_t$ are unobserved, we base the test on the OLS residuals:

1. Regress $y_t$ on $x_t$ by OLS and obtain the residuals $\hat u_t$, $t = 1, \dots, T$.
2. Regress $\hat u_t$ on $\hat u_{t-1}$, for $t = 2, \dots, T$, and obtain the coefficient $\hat\rho$ and its standard error $\mathrm{se}(\hat\rho)$.
3. Compute the t statistic $t_{\hat\rho} = \hat\rho / \mathrm{se}(\hat\rho)$ and compare it with the critical value from the t distribution (or the standard normal for large $T$). Reject $H_0$ if $|t_{\hat\rho}|$ exceeds the critical value, and conclude that $u_t$ is serially correlated.

Because the regressors are strictly exogenous, this residual-based t test is valid: under $H_0$ the errors are serially uncorrelated, so the usual OLS inference applies in the auxiliary regression.
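The test steps above can be sketched on one simulated sample (illustrative; the data-generating values $T = 200$, $\rho = 0.5$, etc. are assumptions, chosen so that $H_0$ is false and the test should reject):

```python
import numpy as np

# Residual-based t test for AR(1) serial correlation, on one simulated sample.
# Parameter values (T = 200, rho = 0.5, etc.) are illustrative assumptions.
rng = np.random.default_rng(2)
T = 200
beta0, beta1, rho = 1.0, 2.0, 0.5

x = rng.normal(size=T)
e = rng.normal(size=T)
u = np.empty(T)
u[0] = e[0]
for t in range(1, T):             # AR(1) errors: u_t = rho * u_{t-1} + e_t
    u[t] = rho * u[t - 1] + e[t]
y = beta0 + beta1 * x + u

# Step 1: OLS of y_t on x_t; keep the residuals u_hat_t.
xd = x - x.mean()
b1 = xd @ (y - y.mean()) / (xd @ xd)
b0 = y.mean() - b1 * x.mean()
uhat = y - b0 - b1 * x

# Step 2: regress u_hat_t on u_hat_{t-1} (through the origin, since the
# residuals already have mean zero) to get rho_hat and its standard error.
z, w = uhat[:-1], uhat[1:]
rho_hat = (z @ w) / (z @ z)
ehat = w - rho_hat * z
se_rho = np.sqrt(ehat @ ehat / (len(w) - 1) / (z @ z))

# Step 3: t statistic for H0: rho = 0.
t_stat = rho_hat / se_rho
print(rho_hat, t_stat)            # |t_stat| well above 2 here, so reject H0
```

With $\rho = 0.5$ in the data-generating process, $\hat\rho$ lands near 0.5 and the t statistic is far beyond any conventional critical value, so the test correctly detects the serial correlation.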