TS Module 2 practice problems variances and covariances



NEAS

TS Module 2 Time series concepts practice problems

(The attached PDF file has better formatting.)

Time series practice problems variances and covariances

** Exercise 2.1: Random walk

The time series Yt = Yt-1 + et is a random walk with σ²e = 0.25 and Yt = 0 for t < 1.

What is the variance of YN?

What is the standard deviation of YN?

Part A:

A random walk is the cumulative sum of a white noise process: YN = e1 + e2 + … + eN.

The error terms et are independent, with a variance of 0.25 each.

The variance of YN is N × 0.25.

Part B:

The standard deviation of YN is √(N × 0.25) = 0.5 × √N.

See equation 2.2.11 on page 13.
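A short simulation can confirm these two results. This is an editorial sketch, not part of the original problem: the seed, the choice N = 50, and the number of simulations are arbitrary.

```python
import numpy as np

# Monte Carlo check of Exercise 2.1: for a random walk with error variance
# sigma^2_e = 0.25, Var(Y_N) = N * 0.25 and SD(Y_N) = 0.5 * sqrt(N).
# N = 50 and n_sims = 200_000 are arbitrary illustrative choices.
rng = np.random.default_rng(0)
N, n_sims = 50, 200_000
e = rng.normal(0.0, np.sqrt(0.25), size=(n_sims, N))
Y_N = e.sum(axis=1)          # Y_N = e_1 + e_2 + ... + e_N for each simulation
print(Y_N.var())             # near N * 0.25 = 12.5
print(Y_N.std())             # near 0.5 * sqrt(50) = 3.536
```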

** Question 2.2: Stationary time series

If a time series is stationary, which of the following is true?

A. ρ1 = 1

B. ρk = ρ-k

C. ρk = ρk+1

D. ρk > ρk+1

E. ρk < ρk+1

Answer 2.2: B

See equation 2.3.2 on page 16.

Statement A:

ρ0 = 1, not ρ1.

Statements C, D, E:

The absolute value of the autocorrelation for lag k > 0 is never greater than 1, but it can increase or decrease between any two lags. Consider an autoregressive process with φ1 = 0 and φ2 = 0.5. The autocorrelations for odd lags (k = 1, 3, 5, …) are 0; the autocorrelations for even lags (k = 2, 4, 6, …) are 0.5^(k/2). The autocorrelation decreases from lag 0 to lag 1, increases from lag 1 to lag 2, decreases from lag 2 to lag 3, and so forth.
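The counterexample above can be checked numerically with the standard Yule-Walker recursion for AR(2) autocorrelations, ρk = φ1 ρk-1 + φ2 ρk-2 with ρ0 = 1 and ρ1 = φ1 / (1 - φ2). This is an editorial sketch, not part of the practice problem:

```python
# Yule-Walker recursion for the AR(2) counterexample with phi1 = 0, phi2 = 0.5.
# Odd-lag autocorrelations are 0; even-lag autocorrelations are 0.5^(k/2).
phi1, phi2 = 0.0, 0.5
rho = [1.0, phi1 / (1 - phi2)]       # rho_0 = 1, rho_1 = 0
for k in range(2, 7):
    rho.append(phi1 * rho[k - 1] + phi2 * rho[k - 2])
print(rho)   # [1.0, 0.0, 0.5, 0.0, 0.25, 0.0, 0.125]
```

The printed sequence falls, rises, falls, and so forth, which is why choices C, D, and E all fail.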

** Exercise 2.3: Equally weighted moving average

Let Yt = ½ × (et + et-1). The variance of et is σ²e.

What is γt,t, the variance of Yt?

What is γt,t-1, the covariance of Yt and Yt-1?

What is ρt,t-1, the correlation of Yt and Yt-1?

Part A:

The variance of Yt is (½)² × σ²e + (½)² × σ²e = ½ σ²e.

Part B:

The covariance of Yt with Yt-1 = covariance( ½ × (et + et-1), ½ × (et-1 + et-2) ).

Expanding the product gives four terms. Three of the terms have different e’s; one has e²t-1.

The e’s are independent, so the three terms with different e’s have a covariance of zero.

The one term with a non-zero covariance is (½)² × σ²e = ¼ σ²e.

Part C:

The correlation of Yt with Yt-1 is the covariance divided by the product of the standard deviations of the two terms. A moving average time series is stationary if the underlying time series is stationary. The e’s are a stationary white noise process, so the moving average time series is also stationary, and the variances of all terms are the same. The product of the standard deviations of two terms is the variance. The correlation = (¼ σ²e) / (½ σ²e) = 0.500.

See Cryer and Chan page 15, equation 2.2.16:

ρt,s = 0.5 for |t-s| = 1
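A quick simulation confirms the three answers. This is an editorial sketch with arbitrary seed and sample size; it uses unit-variance white noise, so σ²e = 1:

```python
import numpy as np

# Simulation check of Exercise 2.3: for Y_t = (e_t + e_{t-1}) / 2 with
# unit-variance white noise, Var(Y_t) = 1/2, Cov(Y_t, Y_{t-1}) = 1/4,
# and Corr(Y_t, Y_{t-1}) = 0.5.
rng = np.random.default_rng(1)
e = rng.normal(size=1_000_000)
Y = 0.5 * (e[1:] + e[:-1])                     # two-period moving average
var_Y = Y.var()
corr_1 = np.corrcoef(Y[1:], Y[:-1])[0, 1]      # lag-1 autocorrelation
print(var_Y)    # near 0.5
print(corr_1)   # near 0.5
```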

** Exercise 2.4: Equally weighted moving average

Let Yt = (et + et-1 + … + et-(N-1)) / N. et is a white noise process, and the variance of et is σ²e.

What is γt,t, the variance of Yt?

What is γt,t-1, the covariance of Yt and Yt-1?

What is ρt,t-1, the correlation of Yt and Yt-1?

What is γt,t-j, the covariance of Yt and Yt-j?

What is ρt,t-j, the correlation of Yt and Yt-j?

Part A:

The Cryer and Chan text shows the analysis for a two-period moving average (N = 2). The homework assignment for this module extends the analysis to longer periods. This practice problem gives the reasoning for any N. Final exam problems give values for N and σ²e and test variances, covariances, and correlations.

Yt is a stationary time series, since it is a linear combination of white noise processes. The variance of Yt is the expected value of [ (et + et-1 + … + et-(N-1)) / N ] × [ (et + et-1 + … + et-(N-1)) / N ].

The e’s are independent, normally distributed random variables with means of zero. The expected value of et × es = 0 for t ≠ s. In the expression above, N terms have t = s and N × (N-1) terms have t ≠ s, so the expected value of the expression is 1/N² × σ²e + 1/N² × σ²e + … + 1/N² × σ²e = 1/N σ²e.

We express this result as "the variance of the mean of N independent, identically distributed random variables is inversely proportional to N."

Part B:

The covariance of Yt with Yt-1 = covariance[ (et + et-1 + … + et-(N-1)) / N, (et-1 + et-2 + … + et-N) / N ].

Expanding the product gives N² terms. (N² - (N-1)) terms have different e’s; (N-1) have e²t-j.

The e’s are independent, so the terms with different e’s have a covariance of zero.

The (N-1) terms with a non-zero covariance each have an expected value of 1/N² × σ²e.

The covariance for a lag of one period is (N-1)/N² × σ²e.

Part C:

The correlation of Yt with Yt-1 is the covariance divided by the product of the standard deviations of the two terms. A moving average time series is stationary if the underlying time series is stationary. The e’s are a stationary white noise process, so the moving average time series is also stationary, and the variances of all terms are the same. The product of the standard deviations of two terms is the variance. The correlation = [ (N-1)/N² × σ²e ] / [ N/N² × σ²e ] = (N-1)/N.

Moving averages are often used because they are more stable than the observed values. Any two observed values are uncorrelated, but two adjacent moving averages are highly correlated.

Part D:

The covariance of Yt with Yt-j has (N-j) terms with a non-zero covariance for j < N. If j ≥ N, all terms have a zero covariance.

The (N-j) terms with a non-zero covariance each have an expected value of 1/N² × σ²e.

The covariance for a lag of j periods is (N-j)/N² × σ²e.

Part E:

The correlation = [ (N-j)/N² × σ²e ] / [ N/N² × σ²e ] = (N-j)/N.
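The general formula ρt,t-j = (N-j)/N can be checked by simulation. This is an editorial sketch; the choice N = 5, the seed, and the sample size are arbitrary:

```python
import numpy as np

# Simulation check of Exercise 2.4 for an illustrative N = 5: the lag-j
# correlation of an equally weighted N-period moving average of white noise
# is (N - j) / N, and is 0 for j >= N.
rng = np.random.default_rng(2)
N = 5
e = rng.normal(size=2_000_000)
Y = np.convolve(e, np.ones(N) / N, mode="valid")   # mean of the last N e's
corrs = []
for j in range(1, N + 1):
    r = np.corrcoef(Y[j:], Y[:-j])[0, 1]
    corrs.append(r)
    print(j, round(r, 3))    # near (N - j) / N
```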

Jacob:

Are these the moving average processes in ARIMA analysis?

Rachel:

The moving average processes discussed in this course have two differences:

They are weighted averages, where the weights are the θj parameters.

The weighted average becomes the next element of the time series.

** Exercise 2.5: Random walk

Let Yt be a random walk with zero drift: Yt = Yt-1 + et.

The variance of the error term is σ²e.

The time series starts at t = 1: Yt = 0 for t < 1.

This time series is not stationary.

What is γt,t, the variance of Yt for t > 0?

What is γt,s, the covariance of Yt and Ys for t < s?

What is ρt,s, the autocorrelation of Yt and Ys for t < s?

Part A:

Yt is the sum of t independent, identically distributed random variables, each with variance σ²e. The variance of Yt is t × σ²e.

Part B:

The covariance is the expected value of (e1 + e2 + … + et) × (e1 + e2 + … + es). These random variables are independent, so only t terms have a non-zero covariance (since t < s). The covariance γt,s is t × σ²e.

Part C:

The correlation is the covariance divided by the product of the standard deviations of the terms, or the square roots of the variances of the terms: (t × σ²e) / [ t × σ²e × s × σ²e ]^0.5 = √(t/s).

(See Cryer and Chan equations 2.2.12 and 2.2.13 on page 13.)

Page 13, equation 2.2.12: γt,s = t × σ²e for 1 ≤ t ≤ s.

Page 13, equation 2.2.13: the autocorrelation function for a random walk is ρt,s = √(t/s) for 1 ≤ t ≤ s.

Stationary processes differ from non-stationary processes. Cryer and Chan write on page 16, equation 2.3.1:

"For a stationary process, the covariance between Yt and Ys depends on time only through the time difference |t - s| and not otherwise on the actual times t and s. For a stationary process, we can simplify our notation and write ρk = Corr(Yt, Yt-k)."

ARMA processes apply to stationary processes. A random walk is not stationary, so the autocorrelation depends on the actual times t and s.
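The formula ρt,s = √(t/s) can be verified by simulating many random walks and correlating observations t and s across simulations, exactly as Rachel describes below. This editorial sketch uses the arbitrary choices t = 4, s = 16, so the theoretical value is √(4/16) = 0.5:

```python
import numpy as np

# Monte Carlo check of Exercise 2.5 Part C: for a zero-drift random walk,
# Corr(Y_t, Y_s) = sqrt(t / s) for t < s. Here sqrt(4 / 16) = 0.5.
rng = np.random.default_rng(3)
t, s, n_sims = 4, 16, 200_000
e = rng.normal(size=(n_sims, s))
paths = e.cumsum(axis=1)                       # one simulated random walk per row
r = np.corrcoef(paths[:, t - 1], paths[:, s - 1])[0, 1]
print(round(r, 2))                             # near sqrt(4/16) = 0.5
```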

Jacob:

Yt is a set of observations, such as {100, 102, 101, 98, 101, 103, 100, …}.

I understand what is meant by ρt,t-1: we examine pairs of consecutive observations to see if they are related.

For the time series above, ρt,t-1 is the correlation of two series:

{100, 102, 101, 98, 101, 103, 100, …} and {102, 101, 98, 101, 103, 100, …}.

Here t ranges over all integers from 2 to infinity, and t-1 ranges from 1 to infinity.

But ρt,s is the autocorrelation of two specific observations. If t = 4 and s = 7, these are observations 4 and 7. Two scalars do not have a correlation. In the illustration above, the correlation of 98 and 100 has no meaning.

Rachel:

You are treating the observed values as the random variables. In fact, the observed values are one simulation of the random variables. The meaning of ρt,s is as follows. We simulate the random walk Yt 100,000 times. From each simulation, we take observations t and s. These observations are random variables, not scalars. We have two series, each with 100,000 values. The Nth elements of the two series are observations 4 and 7 from the Nth random walk. All the random walks have the same parameters (drift and volatility), but they each have different values. The correlation of these two series is ρt,s.

Jacob:

Was I correct about ρt,t-1?

Rachel:

You explained the sample autocorrelation of an empirical time series. The sample autocorrelation is an estimate of the true autocorrelation of the time series process. In this chapter, Cryer and Chan deal with the theoretical autocorrelation of the process, not with estimating the autocorrelation from empirical data.

** Question 2.6: Weakly stationary time series

All but which of the following are true for a weakly stationary time series Yt?

A. The mean function is constant over time.

B. γt,t = γt-1,t-1

C. Covariance(Yt-2, Yt-4) = Covariance(Yt-3, Yt-1)

D. Correlation(Yt+1, Yt-1) = Correlation(Yt, Yt+2)

E. Correlation(Yt+1, Yt-1) = Correlation(Yt-2, Yt+2)

Answer 2.6: E

See the top of page 17, where Cryer and Chan give two attributes of weakly stationary time series:

The mean function is constant over time.

γt,t-k = γ0,k for all time t and lag k.

Choice E says the correlation for lag 2 = the correlation for lag 4. This is not correct.

Jacob:

How might one summarize a stationary time series?

Rachel:

The covariance depends only on the lag between the two observations.

Jacob:

A white noise process is stationary. But the process is stochastic. Perhaps observations 1 and 2 are similar, but observations 4 and 5 are not similar. Aren’t the covariances different?

Rachel:

You are thinking of the time series as a set of known observations. The proper perspective is that the time series is a random draw from all sets of observations with certain attributes. If we examine all sets of observations with the given attributes, the covariance of the first and second elements is the same as the covariance of the fourth and fifth elements.

** Exercise 2.7: Random walk

Yt is a random walk: Yj = e1 + e2 + … + ej, with σ²e = 1.

The autocorrelation of observations t and s, ρt,s, with t < s, is ½ (t and s are the tth and sth observations).

What is ρ2t,s, the autocorrelation of observations 2t and s?

What is ρt,2s, the autocorrelation of observations t and 2s?

What is ρt,t+s, the autocorrelation of observations t and t+s?

What is ρs,t+s, the autocorrelation of observations s and t+s?

Part A:

ρt,s = √(t/s) = ½, so t/s = ¼ and (2t)/s = ½. ρ2t,s = √(2t/s) = √(½) = 0.707.

Jacob:

Perhaps 2t is larger than s, so we should use (s/2t).

Rachel:

t/s = ¼, so s = 4t; 2t is ½ × s.

Jacob:

Do final exam problems use other parameters besides 2?

Rachel:

Yes. For a parameter of k (instead of 2), use the smaller of (kt)/s and s/(kt):

ρ3t,s = √(3t/s) = √(¾) = 0.866.

ρ4t,s = √(4t/s) = √1 = 1.000.

ρ5t,s = √(s/5t) = √(4/5) = 0.894.

ρ6t,s = √(s/6t) = √(4/6) = 0.816.

Part B:

ρt,s = √(t/s) = ½, so t/(2s) = ⅛. ρt,2s = √(t/2s) = √(⅛) = 0.354.

Part C:

ρt,t+s = √(t/(t+s)) = √(1/5) = 0.447.

Jacob:

Can one write this as a formula?

Rachel:

t/(t+s) = 1 / [(t+s) / t ] = 1 / (1 + 1 / (t/s) )

Part D:

ρs,t+s = √(s/(t+s)) = √(4/5) = 0.894.

Jacob:

Can one write this as a formula?

Rachel:

s/(t+s) = 1 / [ (t+s) / s ] = 1 / (1 + t/s)
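The arithmetic for all four parts can be verified in a few lines. This editorial sketch uses t = 1, s = 4, which satisfies the given condition √(t/s) = ½:

```python
import math

# Arithmetic check of Exercise 2.7: rho_{t,s} = sqrt(t/s) = 1/2 implies
# t/s = 1/4, so take t = 1 and s = 4.
t, s = 1, 4
print(round(math.sqrt(2 * t / s), 3))       # rho_{2t,s}  = 0.707
print(round(math.sqrt(t / (2 * s)), 3))     # rho_{t,2s}  = 0.354
print(round(math.sqrt(t / (t + s)), 3))     # rho_{t,t+s} = 0.447
print(round(math.sqrt(s / (t + s)), 3))     # rho_{s,t+s} = 0.894
```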


Attachments
TS Module 2 pps df.pdf (2.6K views, 99.00 KB)
Edited 11 Years Ago by NEAS
hs1234

For question 2.4, why is Yt - 2Yt-1 + Yt-2 stationary? Is it because all the t's cancel out? The tip, "See Cryer and Chan page 20, exercise 2.9" is not helpful because the exercises are not worked out anywhere in the book.

[NEAS: Choice B is the second difference. Yt is a quadratic function of t, so the second difference is a constant. The second difference of X is stationary.]


RayDHIII

"The basic idea of stationarity is that the probability laws that govern the behavior of the process do not change over time." (C&C 16)

"A Stochastic process {Yt} is said to be weakly (or second-order) stationary if

      1. The mean function is constant over time, and

      2. γt,t-k = γ0,k for all time t and lag k" (C&C 17)

What do you get when you solve equation B?  What is the mean function of that equation?  If it contains only constants and stationary time series, then by definition, it is stationary.

For example, exercise 2.9 on page 20 wants us to show that Yt = β0 + β1t + Xt is not stationary:

E(Yt) = E(β0 + β1t + Xt) = E(β0) + t × E(β1) + E(Xt) = β0 + β1t + 0 (we were given that {Xt} is a zero-mean stationary time series and the β's are constants).

What happens to E(Yt) as t increases?  That is the intuition behind this practice problem.  Let me know if you have further questions.
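The mean-function derivation above can be sketched numerically. This is an editorial illustration; the β values are arbitrary:

```python
# Sketch of the point above: with a zero-mean stationary {X_t}, the mean
# function of Y_t = beta0 + beta1 * t + X_t is beta0 + beta1 * t, which
# changes with t, so {Y_t} is not stationary. The betas are arbitrary.
beta0, beta1 = 2.0, 0.5
mean_Y = [beta0 + beta1 * t for t in range(1, 6)]
print(mean_Y)   # [2.5, 3.0, 3.5, 4.0, 4.5] -- grows with t
```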

RDH


jpstyle99
I get 2β2 when I work out the mean function for equation B. Is that correct?
RayDHIII

Hopefully.  The importance of this assignment is to realize when you have stationarity.  If your mean function has values that depend on time (i.e., t), then you do not have a stationary process.

RDH

[NEAS: Correct]


akoch81
For NEAS: In the solution to question 2.2, I would recommend changing the wording under "Statements C, D, E", particularly the last sentence, to "The autocorrelation for lag k decreases from k = 0 to 1, increases from k = 1 to 2, decreases from k = 2 to 3, and so forth."
Otherwise, it sounds like the value of the autocorrelation changes to those values, which would contradict the first part of the explanation "The absolute value of the autocorrelation for lag k > 0 is never greater than 1."

[NEAS: We have done so.]