(a) Why do we have 96 observations for fitting this model, even though we need prior values of the response to find the change and the lagged predictor? Shouldn't we lose one observation from lagging the variable?
(b) How do the slope and intercept of this equation differ from those for the first-order autoregression of the level of shipments on its lag (shown in Table 27.3)?
(c) Compare se from this regression to that of the autoregression for the level of shipments. Explain any differences or similarities.
(d) This model has a small R2 with a slope that is not statistically significant. Why is the fit of this model so poor, whereas that of the AR(1) model is so impressive?
(a)
No observation is lost: the shipments data extend back before 2002, so the value from the period just before the modeling window supplies the lag and the change for the first observation, leaving all 96 observations for the fit.
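As an illustration, here is a minimal pandas sketch with a hypothetical monthly shipments series (the dates, values, and the January 2002 start of the modeling window are made-up assumptions, not the data from the text). Because the series extends back one month before the window, the lag and the change for the first month are available and no rows are dropped.

```python
import pandas as pd

# Hypothetical monthly series that starts one month before the modeling window,
# so the December 2001 value supplies the lag for January 2002.
dates = pd.date_range("2001-12-01", periods=97, freq="MS")
shipments = pd.Series(range(30, 30 + 97), index=dates, dtype=float)

frame = pd.DataFrame({"y": shipments, "y_lag": shipments.shift(1)})
frame["change"] = frame["y"] - frame["y_lag"]

# Restrict to the modeling window; nothing is lost because the lag for the
# first month comes from the observation just before the window.
window = frame.loc["2002-01-01":]
print(len(window), window["y_lag"].isna().sum())  # 96 rows, 0 missing lags
```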
(b)
The intercepts match, and the slope in the model for the changes equals the slope of the autoregression minus 1. Writing the autoregression as y_t = b_0 + b_1 y_{t−1} + error and subtracting y_{t−1} from both sides gives y_t − y_{t−1} = b_0 + (b_1 − 1) y_{t−1} + error, so the fit 0.9 + 0.97 y_{t−1} for the level becomes 0.9 − 0.03 y_{t−1} for the change.
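A quick numerical check of this relationship, using a simulated AR(1) series as a stand-in for the shipments data (the coefficients 0.9 and 0.97 are borrowed from the fitted autoregression; the series itself is made up):

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated stand-in for the shipments series
n = 97
y = np.empty(n)
y[0] = 30.0
for t in range(1, n):
    y[t] = 0.9 + 0.97 * y[t - 1] + rng.normal()

lag, level = y[:-1], y[1:]   # y_{t-1} and y_t
change = level - lag         # y_t - y_{t-1}
X = np.column_stack([np.ones_like(lag), lag])

# Least-squares fits of the level and of the change on the lagged level
b_level, *_ = np.linalg.lstsq(X, level, rcond=None)
b_change, *_ = np.linalg.lstsq(X, change, rcond=None)

print(b_level)                   # [intercept, slope]
print(b_change)                  # same intercept, slope lower by exactly 1
print(b_level[1] - b_change[1])  # 1.0 up to rounding error
```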
(c)
The standard deviation of the residuals s_e is the same because the residuals themselves are the same. A residual in the model for the level of shipments y_t is

e_t = y_t − ŷ_t = y_t − (0.9 + 0.97 y_{t−1})
    = (y_t − y_{t−1}) − (0.9 − 0.03 y_{t−1})   (subtract and add y_{t−1}),

which is also a residual in the model for the changes in shipments.
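The same simulated setup used in the sketch for (b) (a stand-in for the actual shipments data) confirms that the two fits produce identical residual vectors and hence the same s_e:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated stand-in series, as in the sketch for part (b)
n = 97
y = np.empty(n)
y[0] = 30.0
for t in range(1, n):
    y[t] = 0.9 + 0.97 * y[t - 1] + rng.normal()

lag, level = y[:-1], y[1:]
change = level - lag
X = np.column_stack([np.ones_like(lag), lag])

b_level, *_ = np.linalg.lstsq(X, level, rcond=None)
b_change, *_ = np.linalg.lstsq(X, change, rcond=None)

resid_level = level - X @ b_level
resid_change = change - X @ b_change

# Residuals agree term by term, so the residual SD s_e is identical
print(np.allclose(resid_level, resid_change))                    # True
print(np.sqrt(resid_level @ resid_level / (len(level) - 2)),
      np.sqrt(resid_change @ resid_change / (len(change) - 2)))  # same s_e
```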
(d)
Most of the structure explained by the autoregression in the text is contained in the proximity of y_t to y_{t−1}. By differencing the response, we've removed the easy part of the forecast. This model for the differences has to explain the change in the series, and that's much harder.
The t-statistic for the slope in the regression of y_t on y_{t−1} compares the estimate to 0; in a sense, the t-statistic in the regression of the changes on y_{t−1} compares the slope of the autoregression to 1 and tells us that the estimate 0.97 is not far from 1.
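A short numpy sketch (again with a simulated stand-in series, not the actual data) makes this concrete: because the two regressions share the same residuals and the same design matrix, the slope standard errors are equal, so the t-statistic in the change regression equals the t-statistic for testing the level slope against 1.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated stand-in for the shipments series
n = 97
y = np.empty(n)
y[0] = 30.0
for t in range(1, n):
    y[t] = 0.9 + 0.97 * y[t - 1] + rng.normal()

lag, level = y[:-1], y[1:]
change = level - lag
X = np.column_stack([np.ones_like(lag), lag])
XtX_inv = np.linalg.inv(X.T @ X)

def slope_and_se(response):
    """Return the lag slope and its standard error from an OLS fit on X."""
    b = XtX_inv @ X.T @ response
    resid = response - X @ b
    s2 = resid @ resid / (len(response) - 2)
    return b[1], np.sqrt(s2 * XtX_inv[1, 1])

b1, se_level = slope_and_se(level)    # level on lag
d1, se_change = slope_and_se(change)  # change on lag; se_change == se_level

print(d1 / se_change)         # t-statistic for the slope in the change regression
print((b1 - 1.0) / se_level)  # identical: it tests the level slope against 1
```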