Questions and Answers on Time Series Modeling

Chesters Mom
Junior Member

Group: Forum Members
Posts: 14, Visits: 1

AR(2) -- yes, I've calculated it by hand. Look at equations 4.3, 4.4, and 4.5 in the textbook.

Supposed vs. not supposed -- I don't think we are required to be 100% accurate in fitting our series to a model, because there might be no perfect answer, or more than one perfect answer, depending on your series. However, I don't think it is due diligence to force our series into a low-order model just because that's all we can do by hand (or in Excel).

[NEAS: The student project tests if you can apply the time series techniques. We are not testing if you can find the optimal ARIMA model. As we mention on the discussion forum, ARIMA models are proxies for the true explanatory model. The ideal model includes many economic variables that we don’t discuss here. We are not testing if you can force a series into a specific model. You should discuss what the statistical tests indicate about the model, but you are not required to examine all ARIMA models.]


n2thornl
Junior Member

Group: Forum Members
Posts: 21, Visits: 1

Here's a good one:

Are we even supposed to DO an AR(2) model?  I think assuming we have Excel is fine, but personally I do NOT have any statistical software, not even at work.  I can't even use a free trial, because I used all those up for the various programs getting through grad school.

I feel very confident in saying AR(2) does involve linear regression... the other thing mentioned is actually an ARIMA(1,2,0), that is to say, an AR(1) model on second differences. So, in order to attempt an AR(2) model, we've got to manage multiple regression!

Has anyone successfully managed an AR(2) model using only formulas, and no software?

Has anyone successfully managed an AR(2) model using only Excel? (I admit I haven't yet tried the method JulieC just mentioned.)

Jacob: Do we have to examine an AR(2) model? If we are satisfied with an AR(1) model, is that sufficient?

Rachel: That depends on the objective of the student project and the results of the AR(1) model. If we are constructing an ARIMA model, we first examine an AR(1) model. If the residuals are white noise, we can stop there. It is unlikely that all three statistical tests will indicate the residuals are white noise, so we would normally compare an AR(2) model with an AR(1) model.

The student project can focus on other items. It might determine optimal interest rate eras by seeing when a model’s forecasts lose accuracy. We might use an AR(1) model with different periods.

Jacob: Is an AR(2) model more difficult than an AR(1) model?

Rachel: With the regression add-in, the work is the same. Excel does all the calculations; the only difference is choosing two columns for the independent variables, not one column.
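A minimal sketch of the same fit outside Excel, in Python with plain numpy; the placeholder series below is illustrative, not the NEAS data. The AR(2) fit is just ordinary least squares on two lagged copies of the series:

import numpy as np

# Placeholder series -- substitute your own data (e.g., first differences of interest rates).
rng = np.random.default_rng(0)
y = rng.normal(size=200).cumsum()

# AR(2) design: regress y_t on y_{t-1} and y_{t-2}, plus an intercept.
y_t    = y[2:]
y_lag1 = y[1:-1]
y_lag2 = y[:-2]
X = np.column_stack([np.ones_like(y_t), y_lag1, y_lag2])

# Ordinary least squares -- the same calculation the Excel regression add-in performs
# when two columns of independent variables are selected instead of one.
beta, *_ = np.linalg.lstsq(X, y_t, rcond=None)
print("intercept, phi1, phi2 =", beta)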

 


JulieC
Forum Newbie

Group: Forum Members
Posts: 2, Visits: 1

There is a function in Excel that will do multivariable linear regression. The function is LINEST. You have to highlight an area that is 3 columns by 5 rows and then enter the function as an array formula. This will give you the results for a two-variable linear regression. Use the help function to see what the output is.

Jacob: Do we need to use the regression add-in? Can we use the LINEST built-in function to do the regression analysis?

Rachel: The LINEST built-in function gives us the ordinary least squares estimators. For the time series student projects, we need all the residuals, so that we can compute the Durbin-Watson statistic, Bartlett's test, and the Box-Pierce Q statistic. The regression add-in gives all the results we need for the time series projects.
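For readers computing the diagnostics themselves, a sketch in Python (plain numpy) of the three residual tests Rachel lists, assuming resid is the vector of regression residuals; the formulas are the standard textbook ones, not output copied from the add-in:

import numpy as np

def residual_diagnostics(resid, max_lag=10):
    """Durbin-Watson statistic, sample autocorrelations (for Bartlett's test),
    and the Box-Pierce Q statistic for a vector of regression residuals."""
    e = np.asarray(resid, dtype=float)
    n = len(e)
    e = e - e.mean()

    # Durbin-Watson: sum of squared successive differences over the sum of squares.
    dw = np.sum(np.diff(e) ** 2) / np.sum(e ** 2)

    # Sample autocorrelations r_k; Bartlett's test compares |r_k| to roughly 2 / sqrt(n).
    r = np.array([np.sum(e[k:] * e[:-k]) / np.sum(e ** 2) for k in range(1, max_lag + 1)])

    # Box-Pierce Q: n times the sum of squared autocorrelations, compared to a
    # chi-squared distribution with max_lag degrees of freedom.
    q = n * np.sum(r ** 2)
    return dw, r, q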


JoeyR
Junior Member

Group: Forum Members
Posts: 11, Visits: 1

I don't want to harp on the topic, but I would have to agree with this statement -- maybe that's because I'm also in the consulting field. The reality of this material is that it is not relevant to what we [I] do. This is even confirmed in the homework assignments, which state that statisticians use time series differently than actuaries -- I would go even further and say that the average actuary doesn't use time series at all in everyday professional life.

This is my biggest beef with the SOA material and requisites -- I find myself working like nuts, cramming over a short period of time, regurgitating the material on paper, and shortly afterwards forgetting everything I "learned".

Something should be done to make the exam process more relevant to actual practice.

I do see it your way too, Tuba. However, being stuck between having to be efficient at work and having to obtain my FSA as quickly as possible in order to regain control over my life, I've chosen the route of least resistance -- cram and regurgitate!

Jacob: Some candidates say the syllabus material is not relevant to actuarial work; is that true?

Rachel: NEAS does not determine the syllabus material, and we do not judge this issue. We try to make the material more relevant by applying the syllabus material to actuarial work. Most textbook time series examples deal with business sales; we show applications to insurance loss trends, interest rates, and inflation rates.


Chesters Mom
Junior Member

Group: Forum Members
Posts: 14, Visits: 1

Tuba:

I did AR(2) both ways (running the add-in with two lags vs. using the three equations I mentioned) and the results are pretty frustrating. Since my first-differenced series' sample correlogram has a clear geometrically declining oscillatory pattern, I thought AR(2) would be the answer. I agree with you that AR(2) most likely means the multivariate regression, but both Bartlett's test statistic and the Box-Pierce Q statistic come out significant (the tests are on the residuals). I don't know what else to do or try, since no MA component was suggested by my first-differenced series' sample correlogram... I am going to start a new post and am really hoping those who used the 1945-1978 3-month T-bill series can give me some insight.

Jacob: What do we expect from first differences of interest rates? Do we expect to find an oscillatory pattern?

Rachel: Some oscillatory patterns are real. Others stem from measurement error or rounding.

Illustration: Suppose the starting interest rate is 10.0%, and interest rates have a random walk with a drift of 0.04% a month. For simplicity, assume there is no stochasticity.

The interest rates are 10.00%, 10.04%, 10.08%, 10.12%, 10.16%, 10.20%, 10.24%, 10.28%, 10.32%, 10.36%, 10.40%, and so forth. The first differences are 0.04% each month.

If interest rates are rounded to one decimal place, the interest rates are 10.0%, 10.0%, 10.1%, 10.1%, 10.2%, 10.2%, 10.2%, 10.3%, 10.3%, 10.4%, 10.4%, and so forth. The first differences are 0.0%, 0.1%, 0.0%, 0.1%, 0.0%, 0.0%, 0.1%, 0.0%, 0.1%, 0.0%, and so forth.

The first differences have a spurious oscillatory pattern. It is caused by the rounding rule, and it has nothing to do with the time series process.
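A short simulation of this rounding effect, as a sketch in Python assuming the deterministic 0.04%-a-month drift of the illustration:

import numpy as np

# Deterministic drift of 0.04% a month starting at 10.00%, as in the illustration.
rates = 10.00 + 0.04 * np.arange(60)

# Round to one decimal place and take first differences.
diffs = np.diff(np.round(rates, 1))

# Lag-1 sample autocorrelation of the rounded first differences.
d = diffs - diffs.mean()
r1 = np.sum(d[1:] * d[:-1]) / np.sum(d ** 2)

print(np.round(diffs[:10], 2))   # 0.0, 0.1, 0.0, 0.1, 0.0, 0.0, 0.1, 0.0, 0.1, 0.0
print(r1)                        # clearly negative, although the true process never oscillates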

Jacob: Is this rounding problem common?

Rachel: This problem is very common, and it causes much of the apparent oscillatory patterns in stock prices and interest rates. The interest rates on the NEAS web site have two decimal places; we expect this type of spurious pattern.

Jacob: What if the drift is zero? Do we still see a spurious oscillatory pattern?

Rachel: A second reason for an oscillatory pattern is inaccurate measurement and other random errors.

Illustration: Suppose the interest rate is 8.0% each month and the stochasticity is small. The interest rate is measured with an error of –0.1% or +0.1% each month. The error need not be measurement error. Other causes of error are random fluctuations in demand for Treasury bills that affect the auction price but do not affect the underlying economics.

We examine the autocorrelation of the first differences. Each month the measured interest rate is 7.9% or 8.1%. The first difference is 0.0%, +0.2%, or –0.2%. For interest rates in any three-month period, we have eight possible scenarios:

Scenario   Month 1   Month 2   Month 3   Difference 1   Difference 2
1          7.9%      7.9%      7.9%      0.0%           0.0%
2          7.9%      7.9%      8.1%      0.0%           +0.2%
3          7.9%      8.1%      7.9%      +0.2%          –0.2%
4          7.9%      8.1%      8.1%      +0.2%          0.0%
5          8.1%      7.9%      7.9%      –0.2%          0.0%
6          8.1%      7.9%      8.1%      –0.2%          +0.2%
7          8.1%      8.1%      7.9%      0.0%           –0.2%
8          8.1%      8.1%      8.1%      0.0%           0.0%

The average first difference is zero, since the drift in interest rates is zero. Of the eight scenarios, six have at least one first difference of zero, and two scenarios have opposite first differences. The cross-products of successive first differences are therefore zero in six scenarios and negative in two, so the lag-1 autocorrelation of the first differences is negative.
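The same conclusion can be checked by brute force; a sketch in Python simulating a flat 8.0% rate observed with an independent ±0.1% error each month:

import numpy as np

rng = np.random.default_rng(1)

# Flat 8.0% rate observed with an independent +/- 0.1% error each month.
observed = 8.0 + rng.choice([-0.1, 0.1], size=10_000)
diffs = np.diff(observed)        # takes the values -0.2, 0.0, +0.2

# Lag-1 sample autocorrelation of the first differences.
d = diffs - diffs.mean()
r1 = np.sum(d[1:] * d[:-1]) / np.sum(d ** 2)
print(r1)   # close to -0.5: pure measurement error turns the first differences into an MA(1)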

Jacob: What is the implication for model building?

Rachel: Many oscillatory patterns reflect rounding, measurement error, and random fluctuations, not the underlying time series process. We carefully examine apparent oscillatory patterns. We cannot always determine the cause of an oscillatory pattern, but we can suggest possible causes.


NewTubaBoy
Forum Member

Group: Forum Members
Posts: 25, Visits: 1

Yeah, I believe you can use those equations. There is a lot of ambiguity with the AR(2) model, and the text mentions it several times. Does AR(2) mean just a two-period lag (regress on one variable), or does it mean a one- and a two-period lag (regress on two variables)? I personally think it is the multiple regression model and that the book is sometimes sloppy with its terminology.

I don't remember where I got the online program.  I think if you search for multiple regression model or something like that you can find something in Excel online. 

[NEAS: The AR(2) model is a multiple regression equation.]


Chesters Mom
Junior Member

Group: Forum Members
Posts: 14, Visits: 1

On second thought, Tuba and everyone struggling with AR(2), I think we can compute it by hand... using equations 4.3, 4.4, and 4.5. After all, the two-variable regression Excel add-in seems to be based on equation 3.5 (I verified by hand), isn't it?

Jacob: As we work through the student project, should we calculate the ordinary least squares estimators with the equations in the textbook?

Rachel: Calculating β by its formula is useful, since it helps you see what the regression equation is doing, and it helps you code Excel formulas and VBA macros. Do this once or twice with small data sets and verify that the regression add-in gives the same result.
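A sketch of that once-or-twice check in Python: the simple-regression slope is computed from the textbook formula and compared against a matrix least-squares fit. The data values below are hypothetical placeholders, not NEAS data.

import numpy as np

# Small hypothetical data set: an AR(1) fit, regressing y_t on y_{t-1}.
y = np.array([5.2, 5.5, 5.3, 5.9, 6.1, 6.0, 6.4, 6.3, 6.8, 7.0])
x, yt = y[:-1], y[1:]

# Textbook formulas for the simple-regression OLS estimators:
#   beta  = sum((x - xbar)(y - ybar)) / sum((x - xbar)^2)
#   alpha = ybar - beta * xbar
beta_hat  = np.sum((x - x.mean()) * (yt - yt.mean())) / np.sum((x - x.mean()) ** 2)
alpha_hat = yt.mean() - beta_hat * x.mean()

# Cross-check against a matrix least-squares fit (what the regression add-in reports).
X = np.column_stack([np.ones_like(x), x])
(alpha_ls, beta_ls), *_ = np.linalg.lstsq(X, yt, rcond=None)

print(alpha_hat, beta_hat)
print(alpha_ls, beta_ls)   # should agree to rounding error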

Jacob: Is it useful to compute any other values by formulas?

Rachel: For a multiple regression equation, the independent variables may be correlated, raising the standard error of the ordinary least squares estimators. If a time series is a random walk, the first differences should not be highly correlated, and an AR(2) model should not be distorted by multicollinearity. But an AR(2) model on the random walk entries themselves is distorted by multicollinearity.
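This point about multicollinearity can be seen directly by simulating a random walk and comparing the correlation between the two lagged regressors of an AR(2) fit in levels versus in first differences (a sketch with simulated data, not the NEAS series):

import numpy as np

rng = np.random.default_rng(2)
y = rng.normal(size=1_000).cumsum()   # a simulated random walk
d = np.diff(y)                        # its first differences (white noise)

# Correlation between the two independent variables of an AR(2) regression:
# y_{t-1} vs y_{t-2} in levels, d_{t-1} vs d_{t-2} in first differences.
corr_levels = np.corrcoef(y[1:-1], y[:-2])[0, 1]
corr_diffs  = np.corrcoef(d[1:-1], d[:-2])[0, 1]

print(corr_levels)   # near 1: the lagged levels are nearly collinear
print(corr_diffs)    # near 0: the lagged differences are essentially uncorrelated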


Chesters Mom
Junior Member

Group: Forum Members
Posts: 14, Visits: 1
Tuba --

Thanks for pointing out the error in my AR(2) process; I simply assumed a lag of 2, and no wonder all my diagnostic checking results came out significant! Would you mind sharing that online multivariate regression program with us? I do have SAS at work, but I really don't use it very often, and it might take me a while to learn how to do this in SAS.

NewTubaBoy
Forum Member

Group: Forum Members
Posts: 25, Visits: 1

I think that's exactly right. If you regress on the original series yt and there is a slope (well... a significant slope), then you have to take first differences. Then regress on the differenced series to get the coefficient of your AR(1) model. To do an AR(2) you have to have a two-variable regression... unfortunately the add-in in Excel won't do this (because you have to do the regression simultaneously). I found a program online that does multivariate regression. If you use SAS at all, that will do it for you too. I wish we were given a program that went along with this class and did the procedures we need.

[NEAS: The comment about a significant slope is not correct. The add-in can handle an AR(2) model well enough for the student project. Statistical software like SAS can be expensive; we do as much as possible in Excel.]

Jacob: If we regress the original series on the same series lagged one period (regress yt on yt-1) and the slope coefficient is significant, should we take first differences?

Rachel: That depends on the slope coefficient.

If 0 < β < 1, the time series process may be autoregressive and stationary with a positive parameter. We do not necessarily take first differences; we examine the residuals for various ARIMA models, such as AR(1), AR(2), MA(1), ARMA(1,1), starting with the most reasonable model, usually AR(1), and testing other models if the simpler ones don’t fit.

If –1 < β < 0, the time series process may be autoregressive, stationary, and oscillatory with a negative parameter. This process is less common but not unusual. We examine the economic, financial, or actuarial relations to understand the negative parameter. The negative β may also stem from measurement error, rounding, or other random errors.

If β ≈ 1, the time series may be a random walk. It is not stationary, but its first differences may be a stationary white noise process. This type of process is common in financial and actuarial work.

If β > 1, we graph the series on a logarithmic scale. If the series appears linear, we regress ln(yt) on ln(yt-1). If that β ≈ 1, the logarithm of the time series may be a random walk. This is very common in financial and actuarial work. Stock prices and inflation indices are examples.

If β ≈ –1 or β ≤ –1, the time series is unusual. You should re-check the data before going on.

Jacob: What if β = 0.90? Is that close enough to 1 that the time series is a random walk?

Rachel: That depends on the number of observations. We use tests of significance to see if we can reject the null hypothesis that β = 1.
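A sketch of that significance test in Python, following Rachel's description; the slope estimate, its standard error, and the sample size below are hypothetical placeholders to be replaced with the values from your regression output:

# Test H0: beta = 1 using the slope estimate and standard error from the regression output.
# The numbers below are hypothetical placeholders.
beta_hat = 0.90
se_beta  = 0.06
n_obs    = 120

df = n_obs - 2
t_stat = (beta_hat - 1.0) / se_beta
print(t_stat, df)   # about -1.67 with 118 degrees of freedom: not significant at 5%,
                    # so we cannot reject the hypothesis that the series is a random walk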

Jacob: Can we use the Excel regression add-in for an AR(2) model?

Rachel: Yes; choose yt as the dependent variable, and the two independent variables are yt-1 and yt-2.


Chesters Mom
Junior Member

Group: Forum Members
Posts: 14, Visits: 1

If your original series is stationary, then you can regress on the original y(t)'s. But if you need to first-difference your series in order to obtain stationarity, then I believe you'll need to regress against the first-differenced values. AR(2) is similar to AR(1): your Y values are the, well, Y values, and your X values are your Y values lagged 2 periods, I think.

[NEAS: An AR(2) model has two independent variables: the Y values lagged one period and two periods.]

