TS Module 10 Sample autocorrelations practice problems

 

(The attached PDF file has better formatting.)

 

 

** Exercise 10.1: Sample Autocorrelation Function

 


        A time series has 1,600 observations.

        The sample autocorrelation of lag k is ≈ (–0.5)^k for k > 1.

        The sample autocorrelation of lag 1 is –0.900.


 

 

Explain why each of the following choices is or is not a proper model for this time series.

 


 

A.    A white noise process

B.    A random walk

C.    MA(1)

D.    AR(1)

E.    ARMA(1,1)

F.    ARIMA(1,1,1)

 

Part A: For a white noise process with 1,600 observations, the sample autocorrelations are normally distributed with a mean of zero and a standard deviation of 1/√1,600 = 2.5%. The observed sample autocorrelations are significantly different from zero, so a white noise process is not a proper model.
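To make the test concrete, here is a minimal sketch in Python (with numpy; not part of the original exercise) that compares each sample autocorrelation against two standard errors:

    import numpy as np

    n = 1600                    # number of observations
    se = 1 / np.sqrt(n)         # standard error of sample autocorrelations
                                # under white noise: 1/sqrt(1600) = 0.025

    # Sample autocorrelations from Exercise 10.1:
    # r_1 = -0.900 and r_k = (-0.5)^k for k > 1
    r = {1: -0.900}
    for k in range(2, 7):
        r[k] = (-0.5) ** k

    for k, rk in sorted(r.items()):
        verdict = "significant" if abs(rk) > 2 * se else "not significant"
        print(f"lag {k}: r = {rk:+.4f} ({abs(rk) / se:.1f} standard errors, {verdict})")

The early lags are many standard errors from zero, which is the basis for rejecting white noise.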

 

Part B: For a random walk with 1,600 observations, the sample autocorrelations are close to one, and decline slowly (in absolute value). The sample autocorrelation of lag 1 is –0.900, which is significantly different from one (for a sample of size 1,600). More important, the decline for higher lags is too rapid.
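For contrast, a small simulation sketch (Python with numpy, illustrative only) shows how slowly a random walk's sample autocorrelations decline; sample_acf is a hypothetical helper defined in the snippet, not a library function:

    import numpy as np

    rng = np.random.default_rng(0)
    y = np.cumsum(rng.standard_normal(1600))   # random walk with 1,600 steps

    def sample_acf(x, k):
        # Sample autocorrelation at lag k: lagged cross-products over total variance.
        xbar = x.mean()
        num = np.sum((x[:-k] - xbar) * (x[k:] - xbar))
        den = np.sum((x - xbar) ** 2)
        return num / den

    for k in (1, 5, 10, 20):
        print(f"lag {k}: r = {sample_acf(y, k):.3f}")   # all near 1, declining slowly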

 

Part C: If the time series is MA(1), the autocorrelation function drops to zero for lags > 1. For this time series, the sample autocorrelations have geometric decay after lag 1, indicating an autoregressive component.

 

Part D: If the time series is AR(1), the estimated value of φ is –0.500, which is the ratio of adjacent sample autocorrelations. The sample autocorrelation for lag 1 should then be about –0.500. It is –0.900, so we assume the time series also has a moving average component of order 1.
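A quick numeric check of the ratio argument, using the hypothetical sample autocorrelations from the exercise (an illustrative sketch):

    # Ratios of adjacent sample autocorrelations estimate phi for an AR(1) component.
    r = {1: -0.900}
    for k in range(2, 6):
        r[k] = (-0.5) ** k

    for k in range(2, 5):
        print(f"r_{k + 1} / r_{k} = {r[k + 1] / r[k]:+.3f}")   # each ratio is -0.500

    # Under a pure AR(1) with phi = -0.5, r_1 should also be about -0.500.
    # The observed r_1 = -0.900 breaks the pattern, pointing to an MA(1) term.
    print(f"r_2 / r_1 = {r[2] / r[1]:+.3f}")   # -0.278, not -0.500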

 

Part E: The autoregressive component of order p = 1 gives autocorrelations of lag k ≈ (–0.5)^k for k ≥ 1. The discrepancy for lag 1, whose sample autocorrelation is –0.900, is best explained by a moving average term.

 

Part F: An ARIMA(1,1,1) process, like a random walk (an ARIMA(0,1,0) process), has sample autocorrelations close to one, so it is not a proper model for this time series.

 

 


 

** Exercise 10.2: Sample Autocorrelation Function

 

For a time series of 1,600 observations, the sample autocorrelation function of lag k is ≈ 0.800 × 1.2^(–k) for k < 3. For k ≥ 3, the sample autocorrelations are normally distributed with a mean of zero and a standard deviation of 2.5%.

 


 

A.    Is the time series stationary or non-stationary?

B.    Does the time series have an autoregressive component?

C.    Does the time series have a moving average component?

D.    What is the order of the autoregressive or moving average component?

 

Part A: The sample autocorrelation function declines to zero by the fourth lag, so the process is stationary.

 

Part B: If the time series had an autoregressive component, the sample autocorrelation function would decline geometrically after the greater of p and q, where p is the order of the autoregressive part and q is the order of the moving average part. More precisely, the autocorrelation function would be encased within an envelope with this geometric decay.

 

Jacob: What do you mean by an envelope?

 

Rachel: Consider an AR(2) process with φ₁ = 0 and φ₂ = 0.800. The autocorrelations for lags 1, 2, 3, … are 0, 0.800, 0, 0.640, 0, 0.512, 0, …. This autocorrelation function does not show pure geometric decay, but it is enclosed within the envelope 0.894, 0.800, 0.716, 0.640, … = 0.800^(k/2).
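Rachel's numbers can be reproduced with the recursion ρ_k = φ₂ × ρ_(k-2), which holds for an AR(2) process when φ₁ = 0 (an illustrative sketch, not part of the original post):

    phi2 = 0.800
    rho = {0: 1.0, 1: 0.0}          # with phi1 = 0, rho_1 = phi1 / (1 - phi2) = 0
    for k in range(2, 9):
        rho[k] = phi2 * rho[k - 2]  # Yule-Walker recursion when phi1 = 0

    for k in range(1, 9):
        envelope = phi2 ** (k / 2)  # the envelope 0.8^(k/2): 0.894, 0.800, 0.716, ...
        print(f"lag {k}: rho = {rho[k]:.3f}, envelope = {envelope:.3f}")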

 

Jacob: The process in this exercise seems similar. The first three lags show geometric decay. Subsequent lags have sample autocorrelations close to zero, which are enclosed within this geometric decay envelope.

 

Rachel: We choose the simplest model. If the time series had an autoregressive component, it would have to have moving average parameters for lags k ≥ 3 to offset the autoregressive correlations. Moving average parameters for lags > 2 are not common. It is more reasonable to assume just two moving average terms.

 

Part C: The time series has sample autocorrelations that are significantly different from zero only for lags 1 and 2. This is a moving average process of order 2.

 

Jacob: Between lags 1 and 2, the sample autocorrelations show geometric decay. Doesn’t this indicate an autoregressive component?

 

Rachel: The sample autocorrelations do not show geometric decay. We can always fit geometric decay to a sample of two points, since the degrees of freedom are zero. The apparent geometric decay in the exercise is a red herring (i.e., it is written this way to make you assume an autoregressive process). The first two sample autocorrelations are 0.800 and 0.667; a sample of two points cannot indicate geometric decay.

 

Part D: The order of the process is p = 0 and q = 2.

 


 

** Exercise 10.3: Sample Autocorrelation Function

 

The sample autocorrelations for lags 1 through 5 for a time series of 1,600 observations are 0.461, 0.021, –0.017, 0.025, and –0.009.

 


 

A.    Is the time series stationary?

B.    Do the sample autocorrelations decline to zero by geometric decay or drop off suddenly?

C.    What is the most likely process for this time series?


 

 

Part A: For a white noise process with 1,600 observations, the sample autocorrelations have a standard deviation of 1/√1,600 = 2.5%. The sample autocorrelations for lags 2, 3, 4, and 5 are within one standard deviation of zero, so we assume they reflect random fluctuations. The sample autocorrelations are not significantly different from zero after the first lag, so the process is stationary.

 

Part B: If the sample autocorrelations followed geometric decay, the second sample autocorrelation would be approximately 0.461² ≈ 0.213. The difference from 0.021 is greater than two standard deviations, so geometric decay is not likely. Rather, the sample autocorrelations drop off to zero after lag 1.
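The same comparison in code (an illustrative sketch; numpy assumed):

    import numpy as np

    n = 1600
    se = 1 / np.sqrt(n)            # 0.025 under the white noise null

    r1, r2 = 0.461, 0.021
    expected_r2 = r1 ** 2          # 0.461^2 = 0.213 under geometric decay
    gap = (expected_r2 - r2) / se
    print(f"expected r_2 = {expected_r2:.3f}, observed r_2 = {r2:.3f}")
    print(f"gap = {gap:.1f} standard deviations")   # about 7.7, far above 2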

 

Part C: The most likely process is MA(1).

 


 

** Exercise 10.4: Sample Autocorrelation Function

 

The first six sample autocorrelations for an ARMA(2,2) process with 1,600 values are:

 

Lag (k)            1       2       3       4       5       6
Autocorrelation    0.6%    32.4%   –0.7%   15.8%   0.4%    8.0%
 

You infer that the autoregressive coefficients are zero for lag 1 but non-zero for lag 2, and the moving average coefficients are zero for lag 1 but non-zero for lag 2.

 


 

A.    Explain the rationale for this inference.

B.    What is the expected autocorrelation for lag 9?

C.    What is the expected autocorrelation for lag 10?

 

Part A: The standard deviation of the sample autocorrelations for a white noise process with 1,600 observations is 1/√1,600 = 2.5%. The sample autocorrelations of this time series at lags 1, 3, and 5 are not significantly different from zero. We infer that φ₁ = 0 and θ₁ = 0.

 

Jacob: Perhaps φ₁ = θ₁, but both are non-zero. If the observed value for y_t increases one unit, the error term for Period t increases one unit, and the expected value for Period t+1 increases by 1 × φ₁ – 1 × θ₁ = 0.

 

Rachel: In that scenario, the sample autocorrelations for lags 3 and 5 would be non-zero. Moreover, we use the principle of parsimony: unless we have reason to assume a parameter is non-zero, we assume it is zero.

 


 

        The even sample autocorrelations geometrically decline with a factor of about 50%.

        The odd sample autocorrelations are about zero, indicating that φ₁ ≈ 0.

        The future odd sample autocorrelations should also be about zero.

        The future even sample autocorrelations should be geometrically declining at the same pace.


 

 

Part B: φ₂ affects only autocorrelations of even lags (2, 4, 6, …). Since φ₁ is 0, ρ₉ = 0.

 

Part C: Since φ₁ is 0, φ₂ = ρ₄ / ρ₂ = ρ₆ / ρ₄. We don't know the true autocorrelations, but we have sample autocorrelations: r₄ / r₂ = 15.8% / 32.4% = 0.488 and r₆ / r₄ = 8.0% / 15.8% = 0.506. The average of these two estimates is 0.497 ≈ 0.500, so we estimate ρ₁₀ as 8% × 50% × 50% = 2%.
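The arithmetic, restated as a short script (illustrative only, using the sample autocorrelations from the table):

    # Estimate phi_2 from ratios of even-lag sample autocorrelations,
    # then extrapolate two even steps from lag 6 to lag 10.
    r = {2: 0.324, 4: 0.158, 6: 0.080}

    est1 = r[4] / r[2]              # 0.488
    est2 = r[6] / r[4]              # 0.506
    phi2 = (est1 + est2) / 2        # 0.497, rounded to 0.500
    print(f"phi_2 estimates: {est1:.3f}, {est2:.3f}; average = {phi2:.3f}")

    rho10 = r[6] * 0.5 * 0.5        # lag 6 -> lag 8 -> lag 10
    print(f"estimated rho_10 = {rho10:.1%}")   # 2.0%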

 

Jacob: The exercise says this is an ARMA(2,2) process. Where did we use θ₂? Could this also be an AR(2) process?

 

Rachel: If this were an AR(2) process with φ₁ = 0 and φ₂ = 0.500, r₂ should be about 0.500. Since r₂ is 32.4%, we assume a θ₂ coefficient causes the discrepancy.

 

Note: Final exam problems are multiple choice. You won’t have to calculate the estimates to three decimal places; the best choice is usually obvious.

 


 

** Question 10.5: Sample Autocorrelation Function

 

If the sample autocorrelation function of a time series does not fall off quickly as k increases, we may infer that the time series is probably

 


 

A.    Autoregressive

B.    Moving average

C.    Non-stationary

D.    A random walk

E.    A white noise process

 

Answer 10.5: C

 

Jacob: What if the time series is AR(1) with φ = 0.99? The sample autocorrelations decline very slowly.

 

Rachel: The decline is still geometric decay, which counts as quick decay for a time series. The pattern of the decay matters, not the numerical values. A non-stationary process shows a flatter, more level pattern of sample autocorrelations.

 

Jacob: Can the answer also be D? Random walk sample autocorrelation functions also have slow decay.

 

Rachel: A random walk is one type of non-stationary time series; slow decay doesn't indicate that the process is a random walk. Any ARIMA process with d > 0 is non-stationary.

 


 

** Exercise 10.6: Covariances

 

We examine γ₀, γ₁, and γ₂, the covariances of a stationary time series for lags of 0, 1, and 2.

 

Which of the following is true?

 


 

A.    γ₀ ≥ γ₁

B.    γ₁ ≥ γ₂

C.    γ₂ ≥ γ₁

D.    γ₁ + γ₂ ≥ γ₀

E.    If γ₁ > 0 then γ₂ > 0

 

Statement A: True. γ₀ is the variance, which is positive and constant for a stationary time series.

 


 

        The autocorrelations for later lags are the covariances divided by the variance.

        The autocorrelations have a maximum absolute value of one.

        The variance is positive, so γ₀ ≥ |γ₁| ≥ γ₁ (written out below).
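Written out, the chain of reasoning in these bullets is:

    |\rho_1| = \frac{|\gamma_1|}{\gamma_0} \le 1
    \quad\Longrightarrow\quad
    \gamma_0 \ge |\gamma_1| \ge \gamma_1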


 

 

Statement B: For most time series observed in practice, γ₁ ≥ γ₂. But in theory, the covariances can increase or decrease with the lag. For an MA(q) process with θj = 0 for 1 ≤ j ≤ q–1, the covariances are 0 for lags 1 through q–1 but non-zero for a lag of q. For an AR(p) process with φj = 0 for 1 ≤ j ≤ p–1, the covariances are 0 for lags 1 through p–1 but non-zero for a lag of p.

 

Jacob: Are these processes common?

 

Rachel: They are very common. A monthly time series with seasonal effects may have a higher autocorrelation at lag 12 than at lags 1 through 11.

 

Jacob: What about for AR(1) and MA(1) processes; is this statement true?

 

Rachel: γ₁ may be negative (if φ is negative or θ is positive), whereas γ₂ is positive (for the AR(1) process) or zero (for the MA(1) process). This also negates statement B.

 

Statement C: This statement is false for most time series observed in practice.

 

Statement D: No rationale exists for this statement. If φ and θ are small, γ₁ and γ₂ are close to zero while γ₀ is positive, so the statement is false.

 

Statement E: γ₂ > 0 for an AR(1) process, regardless of γ₁, since γ₂ = φ²γ₀. It is not necessarily positive for any other type of model; its value depends on the φ and θ parameters.

 

Jacob: What else might final exam problems test?

 

Rachel: Final exam problems test whether you understand the concepts in the textbook. The problems may give the type of process. For example, γ₂ > 0 for an AR(1) process, but not necessarily for other processes.

 

