Question

In: Economics

Write the autocorrelation functions for low-order MA and AR models.

Solutions

Expert Solution

ACF and PACF plots: After a time series has been stationarized by differencing, the next step in fitting an ARIMA model is to determine whether AR or MA terms are needed to correct any autocorrelation that remains in the differenced series. Of course, with software like Statgraphics, you could just try some different combinations of terms and see what works best. But there is a more systematic way to do this. By looking at the autocorrelation function (ACF) and partial autocorrelation function (PACF) plots of the differenced series, you can tentatively identify the numbers of AR and/or MA terms that are needed. You are already familiar with the ACF plot: it is merely a bar chart of the coefficients of correlation between a time series and lags of itself. The PACF plot is a plot of the partial correlation coefficients between the series and lags of itself.
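As a concrete illustration (a minimal sketch, not tied to Statgraphics or any other package), the ACF values behind such a bar chart can be computed directly with NumPy:

```python
import numpy as np

def sample_acf(x, nlags):
    """Sample autocorrelation: correlation between a series and lags of itself."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    denom = np.sum(x * x)
    acf = [1.0]  # lag 0 is always 1
    for k in range(1, nlags + 1):
        acf.append(np.sum(x[k:] * x[:-k]) / denom)
    return np.array(acf)

# White noise should show no significant autocorrelation at any lag.
rng = np.random.default_rng(0)
noise = rng.normal(size=500)
print(sample_acf(noise, 5).round(3))
```

For white noise, all the lag-1-and-above bars should fall within roughly plus or minus 2/sqrt(n) of zero, which is the usual significance band drawn on ACF plots.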

In general, the "partial" correlation between two variables is the amount of correlation between them which is not explained by their mutual correlations with a specified set of other variables. For example, if we are regressing a variable Y on other variables X1, X2, and X3, the partial correlation between Y and X3 is the amount of correlation between Y and X3 that is not explained by their common correlations with X1 and X2. This partial correlation can be computed as the square root of the proportional reduction in residual variance (the partial R-squared) that is achieved by adding X3 to the regression of Y on X1 and X2.
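A small numerical sketch of this definition, using synthetic data (the variables Y, X1, X2, X3 here are invented for illustration, not taken from the text):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 500
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
x3 = 0.5 * x1 + rng.normal(size=n)           # x3 shares variance with x1
y = x1 + x2 + 0.8 * x3 + rng.normal(size=n)  # y depends on all three

def resid_ss(y, cols):
    """Residual sum of squares from an OLS regression of y on the given columns."""
    X = np.column_stack([np.ones(len(y))] + cols)
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    r = y - X @ beta
    return float(r @ r)

ss_without = resid_ss(y, [x1, x2])   # regression of Y on X1 and X2 only
ss_with = resid_ss(y, [x1, x2, x3])  # ... after adding X3
# Partial correlation of Y and X3, controlling for X1 and X2:
partial_r = np.sqrt((ss_without - ss_with) / ss_without)
print(round(partial_r, 3))
```

Because x3 is correlated with x1, the ordinary correlation of y and x3 overstates x3's own contribution; the partial correlation strips out the part already explained by x1 and x2.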

A partial autocorrelation is the amount of correlation between a variable and a lag of itself that is not explained by correlations at all lower-order lags. The autocorrelation of a time series Y at lag 1 is the coefficient of correlation between Yt and Yt-1, which (by stationarity) is also the correlation between Yt-1 and Yt-2. But if Yt is correlated with Yt-1, and Yt-1 is equally correlated with Yt-2, then we should also expect to find correlation between Yt and Yt-2. In fact, the amount of correlation we should expect at lag 2 is precisely the square of the lag-1 correlation. Thus, the correlation at lag 1 "propagates" to lag 2 and presumably to higher-order lags. The partial autocorrelation at lag 2 is therefore the difference between the actual correlation at lag 2 and the expected correlation due to the propagation of correlation at lag 1.
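In symbols, this is the standard lag-2 partial autocorrelation; the difference described above appears in the numerator, rescaled by the variance left unexplained at lag 1:

```latex
\phi_{22} \;=\; \frac{\rho_2 - \rho_1^{\,2}}{1 - \rho_1^{\,2}}
```

where \rho_k denotes the autocorrelation at lag k. If the lag-2 correlation is exactly what propagation from lag 1 predicts (\rho_2 = \rho_1^2), the partial autocorrelation at lag 2 is zero.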

Consider, for example, the ACF of the UNITS series before any differencing is performed (plot not shown here): the autocorrelations are significant for a large number of lags--but perhaps the autocorrelations at lags 2 and above are merely due to the propagation of the autocorrelation at lag 1. The PACF plot (also not shown) confirms this.

AR and MA signatures: If the PACF displays a sharp cutoff while the ACF decays more slowly (i.e., has significant spikes at higher lags), we say that the stationarized series displays an "AR signature," meaning that the autocorrelation pattern can be explained more easily by adding AR terms than by adding MA terms. You will probably find that an AR signature is commonly associated with positive autocorrelation at lag 1--i.e., it tends to arise in series which are slightly underdifferenced. The reason for this is that an AR term can act like a "partial difference" in the forecasting equation. For example, in an AR(1) model, the AR term acts like a first difference if the autoregressive coefficient is equal to 1, it does nothing if the autoregressive coefficient is zero, and it acts like a partial difference if the coefficient is between 0 and 1. So, if the series is slightly underdifferenced--i.e., if the nonstationary pattern of positive autocorrelation has not been completely eliminated--it will "ask for" a partial difference by displaying an AR signature. Hence, we have the following rule of thumb for determining when to add AR terms:
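A simulated AR(1) series illustrates this signature (a sketch using the usual sample estimators; the coefficient phi = 0.7 is an arbitrary choice): the ACF decays geometrically, with the lag-2 autocorrelation close to the square of the lag-1 autocorrelation, so the lag-2 partial autocorrelation is near zero.

```python
import numpy as np

rng = np.random.default_rng(2)
phi, n = 0.7, 5000

# Simulate an AR(1) process: y[t] = phi * y[t-1] + e[t]
y = np.zeros(n)
e = rng.normal(size=n)
for t in range(1, n):
    y[t] = phi * y[t - 1] + e[t]

def acf_at(x, k):
    """Sample autocorrelation at lag k."""
    x = x - x.mean()
    return float(np.sum(x[k:] * x[:-k]) / np.sum(x * x))

r1, r2 = acf_at(y, 1), acf_at(y, 2)
pacf2 = (r2 - r1 ** 2) / (1 - r1 ** 2)  # lag-2 partial autocorrelation
print(round(r1, 2), round(r2, 2), round(pacf2, 2))
```

The PACF cutting off after lag 1 while the ACF tails off is exactly the "AR signature" described above.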

  • Rule 6: If the PACF of the differenced series displays a sharp cutoff and/or the lag-1 autocorrelation is positive--i.e., if the series appears slightly "underdifferenced"--then consider adding an AR term to the model. The lag at which the PACF cuts off is the indicated number of AR terms.

In principle, any autocorrelation pattern can be removed from a stationarized series by adding enough autoregressive terms (lags of the stationarized series) to the forecasting equation, and the PACF tells you how many such terms are likely to be needed. However, this is not always the simplest way to explain a given pattern of autocorrelation: sometimes it is more efficient to add MA terms (lags of the forecast errors) instead. The autocorrelation function (ACF) plays the same role for MA terms that the PACF plays for AR terms--that is, the ACF tells you how many MA terms are likely to be needed to remove the remaining autocorrelation from the differenced series. If the autocorrelation is significant at lag k but not at any higher lags--i.e., if the ACF "cuts off" at lag k--this indicates that exactly k MA terms should be used in the forecasting equation. In that case, we say that the stationarized series displays an "MA signature," meaning that the autocorrelation pattern can be explained more easily by adding MA terms than by adding AR terms.
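Returning to the question as posed, the theoretical autocorrelation functions of the low-order models can be written explicitly (with e_t white noise):

```latex
\text{MA(1): } y_t = e_t + \theta e_{t-1}, \qquad
\rho_1 = \frac{\theta}{1+\theta^2}, \quad \rho_k = 0 \text{ for } k \ge 2.

\text{MA(2): } y_t = e_t + \theta_1 e_{t-1} + \theta_2 e_{t-2}, \qquad
\rho_1 = \frac{\theta_1(1+\theta_2)}{1+\theta_1^2+\theta_2^2}, \quad
\rho_2 = \frac{\theta_2}{1+\theta_1^2+\theta_2^2}, \quad \rho_k = 0 \text{ for } k \ge 3.

\text{AR(1): } y_t = \phi y_{t-1} + e_t, \qquad \rho_k = \phi^{k}.

\text{AR(2): } y_t = \phi_1 y_{t-1} + \phi_2 y_{t-2} + e_t, \qquad
\rho_1 = \frac{\phi_1}{1-\phi_2}, \quad
\rho_k = \phi_1 \rho_{k-1} + \phi_2 \rho_{k-2} \text{ for } k \ge 2.
```

These formulas are the algebra behind the graphical rules above: an MA(q) model's ACF cuts off exactly after lag q, while an AR model's ACF decays geometrically (or in damped oscillations) but never cuts off.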

