TS Module 10 Sample Autocorrelation Function



Author
Message
NEAS
Supreme Being

Group: Administrators
Posts: 4.3K, Visits: 1.3K

TS Module 10 Sample autocorrelation functions practice problems

 

(The attached PDF file has better formatting.)

 

Question 1.1: Sample Autocorrelation Function

 

The sample autocorrelation of lag k is ≈ (–0.366)^k for all k > 1, and the sample autocorrelation of lag 1 is –0.900. The time series is most likely which of the following choices?

 


A.    AR(1)

B.     MA(1)

C.     ARMA(1,1)

D.    ARIMA(1,1,1)

E.     A random walk

 

Answer 1.1: C

 

A stationary autoregressive model has geometrically declining autocorrelations for lags greater than its order: if the order is p, the autocorrelations for lags p+1 and higher decline geometrically. That pattern holds here for lags beyond 1, so we presume an autoregressive component of order 1.

 

If the time series were a pure AR(1) process, the sample autocorrelation at lag 1 would also be about –0.366. It is –0.900 instead, so we assume the series also has a moving average component of order 1, making it ARMA(1,1).
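To see how an ARMA(1,1) process produces this shape, here is a minimal sketch in NumPy. It uses the standard ARMA(1,1) autocorrelation formula ρ1 = (1 – θφ)(φ – θ) / (1 – 2θφ + θ²) and ρk = φ ρk–1 for k ≥ 2; the parameter values φ = –0.366 and θ = 0.6 are illustrative, not fitted to the problem.

```python
import numpy as np

# Theoretical ACF of an ARMA(1,1): Y_t = phi*Y_{t-1} + e_t - theta*e_{t-1}
# (illustrative parameter values; not fitted to the numbers in the question)
phi, theta = -0.366, 0.6

rho1 = (1 - theta * phi) * (phi - theta) / (1 - 2 * theta * phi + theta**2)
rho = [rho1 * phi ** (k - 1) for k in range(1, 7)]   # rho_k = phi^(k-1) * rho_1 for k >= 1

for k, r in enumerate(rho, start=1):
    print(f"lag {k}: {r:+.3f}")
# Lag 1 depends on both phi and theta; lags 2, 3, ... decline geometrically
# by the factor phi, which is the autoregressive signature after lag 1.
```

The point of the sketch is that the lag-1 autocorrelation can sit far off the geometric pattern that governs the later lags, which is exactly what the question describes.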

 

 


 

Question 1.2: Sample Autocorrelation Function

 

For a time series of 1,600 observations, the sample autocorrelation function of lag k is ≈ 0.366 × 1.2^(–k) for k < 4. For k ≥ 4, the sample autocorrelations are normally distributed with a mean of zero and a standard deviation of 2.5%. The time series is probably

 


 

A.    Stationary and Autoregressive of order 3

B.     Stationary and Moving Average of order 3

C.     Non-stationary

D.    A random walk with a drift for three periods

E.     A combination of stationary autoregressive of order 3 and a white noise process

 

Answer 1.2: B

 

Statement B: For k ≥ 4 and 1,600 observations, the sample autocorrelations are normally distributed with a mean of zero and a standard deviation of 2.5% = 1/√1,600; these are the sample autocorrelations of a white noise process. A moving average time series has sample autocorrelations that drop off to those of a white noise process after its order (3 in this problem).

 

Statement A: An autoregressive process has geometrically declining autocorrelations for lags greater than its order; they decline gradually rather than cutting off to white noise after lag 3.

 

Statements C and D: A non-stationary time series would not have sample autocorrelations that drop off to those of white noise after 3 lags; its sample autocorrelations stay high at long lags. A random walk is not stationary, so the same objection applies.

 

Statement E: A stationary autoregressive process already has white noise (its error terms) built in; combining it with a white noise process does not change the pattern, so the combination would still show geometrically declining autocorrelations rather than a cutoff after lag 3.
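A quick simulation makes the MA(3) pattern concrete. This is a sketch with NumPy; the MA(3) coefficients and the random seed are arbitrary choices, not implied by the problem.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 1600
e = rng.standard_normal(N + 3)

# MA(3): y_t = e_t + 0.8*e_{t-1} + 0.5*e_{t-2} + 0.3*e_{t-3}  (arbitrary coefficients)
y = e[3:] + 0.8 * e[2:-1] + 0.5 * e[1:-2] + 0.3 * e[:-3]

def sample_acf(x, max_lag):
    d = x - x.mean()
    denom = np.sum(d * d)
    return [np.sum(d[:-k] * d[k:]) / denom for k in range(1, max_lag + 1)]

acf = sample_acf(y, 8)
sd_white_noise = 1 / np.sqrt(N)        # 0.025 for N = 1,600
for k, r in enumerate(acf, start=1):
    print(f"lag {k}: {r:+.3f}   (white-noise sd = {sd_white_noise:.3f})")
# Lags 1-3 are well outside the white-noise range; lags 4 and beyond are not.
```

The first three sample autocorrelations stand out clearly, while those at lags 4 and beyond fluctuate around zero on the scale of 1/√1,600 = 0.025, matching the description in the question.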

 

 


 

Question 1.3: Sample Autocorrelation Function

 

If the sample autocorrelations for a time series of 1,600 observations for the first five lags are 0.461, 0.021, –0.017, 0.025, and –0.009, the time series is most likely which of the following choices?

 


 

A.    AR(1) with φ1 ≈ 0.45

B.     MA(1)

C.     ARMA(1,1) with φ1 ≈ 0.45

D.    ARIMA(1,1,1) with φ1 ≈ 0.45

E.     A random walk

 

Answer 1.3: B

 

The sample autocorrelations decline to zero after the first lag, with random fluctuations within the white noise limits (about ±2/√1,600 = ±0.05). The process is presumably a moving average of order 1.

 

The process could also have an AR(1) parameter with φ1 < 0.15, but we have no reason to assume an autoregressive parameter.  If φ1 ≈ 0.45, the sample autocorrelation of lag 2 should be significantly more than zero.
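To make the "white noise limits" concrete, here is a small sketch applying the usual rough bounds of ±2/√N for the sample autocorrelations of white noise to the five values in the question:

```python
import numpy as np

r = np.array([0.461, 0.021, -0.017, 0.025, -0.009])  # sample autocorrelations, lags 1-5
N = 1600
bound = 2 / np.sqrt(N)                                # rough white-noise limit: 0.05

for k, rk in enumerate(r, start=1):
    flag = "outside" if abs(rk) > bound else "inside"
    print(f"lag {k}: {rk:+.3f} is {flag} +/-{bound:.3f}")
# Only lag 1 stands out, which matches an MA(1) process.
```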

 

 

 


 

Question 1.4: Covariances

 

We examine γ0, γ1, and γ2, the autocovariances of a stationary time series at lags 0, 1, and 2. Which of the following is true?

 

γ0 is the variance, which is constant for a stationary time series, so the autocorrelations are the covariances divided by the variance. The autocorrelations have a maximum absolute value of one, and the variance is positive.

 


 

A.    γ0 ≥ γ1

B.     γ1 ≥ γ2

C.     γ2 ≥ γ1

D.    γ1 + γ2 ≥ γ0

E.     If γ1 ≥ 0, then γ2 ≥ 0

 

Answer 1.4: A

 

The covariances of the time series can increase or decrease with the lag.

 

Illustration: For an MA(q) process with θj = 0 for 1 ≤ j ≤ q-1, the covariances are 0 for lags of 1 through q-1 but non-zero for a lag of q.

 

The variances of all elements of a stationary time series are the same, so none of the covariances can exceed the variance: |γk| ≤ γ0 for every lag k.

 

All five choices can be true.  Only choice A is always true.
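A short numerical check illustrates the point. This is a sketch using an arbitrary simulated AR(1) series (not part of the original solution): the lag-0 autocovariance dominates the others, while the ordering of γ1 and γ2 depends on the model.

```python
import numpy as np

rng = np.random.default_rng(1)
N = 100_000

# Simulate an AR(1) series y_t = 0.7*y_{t-1} + e_t (arbitrary illustrative model)
e = rng.standard_normal(N)
y = np.empty(N)
y[0] = e[0]
for t in range(1, N):
    y[t] = 0.7 * y[t - 1] + e[t]

def sample_autocov(x, k):
    d = x - x.mean()
    return np.sum(d[: len(d) - k] * d[k:]) / len(d)

g0, g1, g2 = (sample_autocov(y, k) for k in range(3))
print(g0, g1, g2)                      # g0 is the largest in magnitude
assert g0 >= abs(g1) and g0 >= abs(g2)  # choice A always holds
```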

 

 

 

 

 


 

Question 1.5: Sample Autocorrelation Function

 

A time series is {8, 12, 11, 16, 10, 14, 9, 14, 13, 13, 9, 15}. What is r2, the sample autocorrelation function at lag 2? Use the data in the table below.

 

  T      Yt     (3)     (4)      (5)
  1       8      -4       4       16
  2      12       0       0        0
  3      11      -1       2        1
  4      16       4       8       16
  5      10      -2       6        4
  6      14       2       4        4
  7       9      -3      -3        9
  8      14       2       2        4
  9      13       1      -3        1
 10      13       1       3        1
 11       9      -3                9
 12      15       3                9
Total   144   0.000   23.00   74.000

 


 

            Column 3 is yt – ȳ
            Column 4 is (yt – ȳ) × (yt+2 – ȳ)
            Column 5 is (yt – ȳ)²


 

 


 

A.    –0.311

B.     –0.120

C.     +0.121

D.    +0.311

E.     +1.120

 

Answer 1.5: D

 

ȳ = ∑ yt / 12 = 144 / 12 = 12.000

r2 = ∑ (yt – ȳ)(yt+2 – ȳ) / ∑ (yt – ȳ)² = 23 / 74 = 0.311
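The same arithmetic in a few lines of NumPy, as a sketch that reproduces the table's column sums:

```python
import numpy as np

y = np.array([8, 12, 11, 16, 10, 14, 9, 14, 13, 13, 9, 15], dtype=float)
ybar = y.mean()                  # 144 / 12 = 12
d = y - ybar                     # deviations from the mean (column 3)

num = np.sum(d[:-2] * d[2:])     # sum of (yt - ybar)(yt+2 - ybar) = 23
den = np.sum(d ** 2)             # sum of (yt - ybar)^2 = 74
print(round(num / den, 3))       # r2 = 0.311
```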

 


Michelle2010
Junior Member

Group: Forum Members
Posts: 18, Visits: 1

I have a question on 10.5. I understand that an ARMA(3,2) and an ARMA(3,3) process should have sample autocorrelations that decline geometrically after a lag of 3 periods. What would an ARMA(p,q) process with p < q look like? Would it just not begin declining geometrically until a lag of q periods?

[NEAS: Correct.]


Michelle2010
Junior Member

Group: Forum Members
Posts: 18, Visits: 1

I also have a question on 10.6. Why is Var(rk) for a stationary ARMA process proportional to 1/N? I thought this was only true for white noise processes.

I understand the comment in the solution that this result is consistent with the variance of a sample mean, but I don't really get how the variance of a sample mean is related to the variance of the sample autocorrelation.

[NEAS: For a white noise process, the sample autocorrelations are approximately normally distributed with a variance proportional to 1/N. For an ARMA process with large N, the means of the sample autocorrelations are governed by the model. If the model is correct, the sample autocorrelations have approximately the same distribution around those means.]
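For the clear-cut white noise case, a quick Monte Carlo sketch shows the 1/N behaviour of Var(r1); the ARMA extension is the large-sample result described above. The sample sizes and number of replications below are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(2)

def r1(x):
    """Sample autocorrelation at lag 1."""
    d = x - x.mean()
    return np.sum(d[:-1] * d[1:]) / np.sum(d * d)

for N in (100, 400, 1600):
    estimates = [r1(rng.standard_normal(N)) for _ in range(2000)]
    print(f"N = {N:5d}: sd(r1) = {np.std(estimates):.4f}, 1/sqrt(N) = {1 / np.sqrt(N):.4f}")
# The standard deviation of r1 tracks 1/sqrt(N), i.e. Var(r1) is proportional to 1/N.
```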


Adrian
Forum Newbie

Group: Forum Members
Posts: 8, Visits: 1
I think that's correct.  That's how I understand it anyway.
Adrian
Forum Newbie

Group: Forum Members
Posts: 8, Visits: 1
I don't think that question 10.6 is talking about Var(rk).  Did I miss something?
Michelle2010
Junior Member
Junior Member (19 reputation)Junior Member (19 reputation)Junior Member (19 reputation)Junior Member (19 reputation)Junior Member (19 reputation)Junior Member (19 reputation)Junior Member (19 reputation)Junior Member (19 reputation)Junior Member (19 reputation)

Group: Forum Members
Posts: 18, Visits: 1
Oops...I should have posted this under the TS Module 11 - Computing and using sample autocorrelations section.  I'll repost.  Thanks!