In: Statistics and Probability
2. Consider the same model in a time series context, namely, y_t = β0 + β1 x_t + u_t, t = 1, ..., T, where u_t = ρ u_{t−1} + v_t, |ρ| < 1, and v_t is i.i.d. with E(v_t) = 0 and Var(v_t) = σ_v². (a) What is the problem in using OLS to estimate the model? Is there any problem in hypothesis testing? (b) Show that Cov(u_t, u_{t−τ}) = ρ^τ Var(u_{t−τ}) for τ = 0, 1, 2, .... (c) How would you test the hypothesis that ρ = 0 using an alternative of your choice? (d) How would you estimate the model correcting for autocorrelation if you know that ρ = .55? Show all the steps. (e) Can you do part (d) above if ρ is unknown? Explain the steps.
Let's consider the following model:

y_t = β0 + β1 x_t + u_t, where t = 1, 2, ..., T,

and

u_t = ρ u_{t−1} + v_t, with |ρ| < 1,

where v_t is i.i.d. with E(v_t) = 0 and Var(v_t) = σ_v².
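A quick way to see the model's behavior is to simulate it. Below is a minimal numpy sketch (all parameter values, including β0 = 1, β1 = 2, ρ = 0.55, are illustrative assumptions, not given in the problem): OLS still recovers the coefficients, but the residuals inherit the AR(1) autocorrelation, which is what invalidates the usual OLS standard errors in part (a).

```python
import numpy as np

# Simulate y_t = b0 + b1*x_t + u_t with AR(1) errors u_t = rho*u_{t-1} + v_t.
# Parameter values are illustrative assumptions for the sketch.
rng = np.random.default_rng(0)
T, b0, b1, rho, sigma_v = 500, 1.0, 2.0, 0.55, 1.0

x = rng.normal(size=T)
u = np.empty(T)
u[0] = rng.normal(scale=sigma_v / np.sqrt(1 - rho**2))  # stationary start
for t in range(1, T):
    u[t] = rho * u[t - 1] + rng.normal(scale=sigma_v)
y = b0 + b1 * x + u

# OLS: coefficient estimates remain consistent under autocorrelation,
# but the usual OLS standard errors are no longer valid.
X = np.column_stack([np.ones(T), x])
beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta_hat

# Lag-1 sample autocorrelation of the residuals should be near rho.
r1 = np.corrcoef(resid[1:], resid[:-1])[0, 1]
print(beta_hat, r1)
```

With T = 500 the estimated lag-1 residual autocorrelation lands close to the true ρ = 0.55, which motivates the test in part (c).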
(c) To test for the presence of autocorrelation, we test the hypotheses

H0: ρ = 0 vs. H1: ρ ≠ 0.

A standard choice is the Durbin-Watson test, with statistic

d = Σ_{t=2}^{T} (e_t − e_{t−1})² / Σ_{t=1}^{T} e_t²,

where e_t is the observed OLS residual at the t-th time point of the dataset. Under H0, d is close to 2; values well below 2 suggest positive autocorrelation and values well above 2 suggest negative autocorrelation, with the rejection decision made using the Durbin-Watson bounds d_L and d_U.
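The Durbin-Watson statistic is straightforward to compute from a residual series. A minimal sketch, using stylized residual series for illustration (these series are assumptions, not data from the problem):

```python
import numpy as np

def durbin_watson(e):
    """Durbin-Watson statistic: sum of squared first differences of the
    residuals over their sum of squares; near 2 when there is no
    first-order autocorrelation."""
    e = np.asarray(e, dtype=float)
    return np.sum(np.diff(e) ** 2) / np.sum(e ** 2)

# Stylized checks (illustrative): white noise residuals give d near 2;
# a highly persistent series (a random walk) drives d toward 0.
rng = np.random.default_rng(1)
white = rng.normal(size=2000)
persistent = np.cumsum(white)
print(durbin_watson(white), durbin_watson(persistent))
```

The statistic satisfies d ≈ 2(1 − r1), where r1 is the lag-1 residual autocorrelation, which is why d near 2 corresponds to ρ ≈ 0.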
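For part (d) of the question, with ρ = .55 known, the standard correction is quasi-differencing (Cochrane-Orcutt): regress y*_t = y_t − ρ y_{t−1} on x*_t = x_t − ρ x_{t−1} for t = 2, ..., T, so the transformed errors u_t − ρ u_{t−1} = v_t are i.i.d. A minimal sketch on simulated data (parameter values are illustrative assumptions):

```python
import numpy as np

# Simulate the model, then apply the Cochrane-Orcutt quasi-differencing
# transform with the known rho = 0.55. Parameters are assumptions.
rng = np.random.default_rng(2)
T, b0, b1, rho, sigma_v = 400, 1.0, 2.0, 0.55, 1.0
x = rng.normal(size=T)
u = np.empty(T)
u[0] = rng.normal(scale=sigma_v / np.sqrt(1 - rho**2))
for t in range(1, T):
    u[t] = rho * u[t - 1] + rng.normal(scale=sigma_v)
y = b0 + b1 * x + u

# Quasi-difference (drops the first observation); note the transformed
# intercept column is (1 - rho), so OLS on it recovers b0 directly.
y_star = y[1:] - rho * y[:-1]
x_star = x[1:] - rho * x[:-1]
X_star = np.column_stack([np.full(T - 1, 1 - rho), x_star])
beta_co, *_ = np.linalg.lstsq(X_star, y_star, rcond=None)
print(beta_co)  # estimates of (b0, b1)
```

For part (e), when ρ is unknown, the same steps apply after first estimating ρ from the lag-1 autocorrelation of the OLS residuals, iterating between the ρ estimate and the quasi-differenced regression until convergence (the iterative Cochrane-Orcutt procedure).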