MAY 2019 EXAMINATION DIET
SCHOOL OF MATHEMATICS & STATISTICS
MODULE CODE: MT4527
MODULE TITLE: Time Series Analysis
EXAM DURATION: 2 hours
EXAM INSTRUCTIONS: Attempt ALL questions.
The number in square brackets shows the
maximum marks obtainable for that
question or part-question.
Your answers should contain the full
working required to justify your solutions.
A formulae sheet is provided at the end of
the exam.
PERMITTED MATERIALS: Non-programmable calculator
YOU MUST HAND IN THIS EXAM PAPER AT THE END OF THE EXAM
PLEASE DO NOT TURN OVER THIS EXAM PAPER UNTIL YOU ARE
INSTRUCTED TO DO SO.
1. (a) Explain in simple words what the state-space of a stochastic process $X_t$ is. [1]
(b) Consider a random walk $X_t = X_{t-1} + \varepsilon_t$, $t = 0, 1, 2, \ldots$, with initial value $X_0 = 0$. Here, $\varepsilon_t$ is a white noise stochastic process.
(i) Derive the expectation, $E(X_t)$, for all $t = 1, 2, 3, \ldots$ [1]
(ii) Derive the variance, $Var(X_t)$, for all $t = 1, 2, 3, \ldots$ [2]
(iii) Derive the covariance, $Cov(X_t, X_{t+h})$, for all $h = 0, 1, 2, \ldots$. Hence, derive the correlation $Corr(X_t, X_{t+h})$, for all $h = 0, 1, 2, \ldots$. [3]
(c) Consider a Simple Moving Average (SMA) estimator for the trend component of a non-seasonal model with trend. The choice of span q is ad hoc, and subject to the trade-off between two of the estimator's properties. Name those two properties. [2]
2. Consider a stationary AR(p) process with general form,
$$X_t = \phi_1 X_{t-1} + \phi_2 X_{t-2} + \cdots + \phi_p X_{t-p} + \varepsilon_t, \qquad t = 0, \pm 1, \pm 2, \ldots$$
where $\varepsilon_t$ denotes a white noise stochastic process. Derive the Yule-Walker equations for the calculation of the autocorrelation sequence for the AR(p) process, including the initial conditions. [3]
3. Let $\varepsilon_t$ denote a white noise process. Consider the AR(2) process $\{X_t\}$ given by,
$$X_t + \frac{1}{3}X_{t-1} - \frac{2}{9}X_{t-2} = \varepsilon_t.$$
(a) Show that this stochastic process is stationary by considering and solving the relevant equation $\phi(x) = 0$, where $\phi(x)$ represents the AR operator in a standard ARMA model. [1]
(b) Substitute the general expression for the MA($\infty$) process into the defining equation of the AR process, and show that the coefficients of the MA($\infty$) process satisfy a recurrence relation, specifying the recurrence relation and the initial conditions. [4]
(c) Hence, calculate the coefficients of the infinite MA representation. [4]
(d) Assume without proof that, for $L > 0$, the minimum mean square error forecast $\hat{x}_t(L)$, based at origin $t$, of $X_{t+L}$ satisfies the equation
$$X_{t+L} = \nu_0\varepsilon_{t+L} + \nu_1\varepsilon_{t+L-1} + \ldots + \nu_{L-1}\varepsilon_{t+1} + \hat{x}_t(L).$$
Use this equation to show that $\hat{x}_t(L)$ equals the conditional mean of $X_{t+L}$ given the information available at time $t$. Show also that the corresponding conditional variance of $X_{t+L}$ equals $\sigma_\varepsilon^2\left(\nu_0^2 + \nu_1^2 + \ldots + \nu_{L-1}^2\right)$. [4]
(e) Suppose now that the random shocks are Normally distributed and that, after 80 observations have been seen, the point forecast of $X_{82}$ made at origin $t = 80$ is $\hat{x}_{80}(2) = 60$. Evaluate the 95% prediction interval for $X_{82}$ made at the same origin, if the estimate of the white noise variance is $s_\varepsilon^2 = 81$. You may find the information below helpful:
> qnorm(0.975)
[1] 1.959964
[3]
4. A statistician is asked to model the changes in temperature (measured on the Fahrenheit scale) during a chemical experiment. He has a time series x of 200 measurements taken every hour. For example, a measurement of $x_t = -10$ describes a drop of 10 °F from hour $t-1$ to hour $t$.
(a) The statistician plots the original time series (x) as well as the first ($\nabla x$), the second ($\nabla^2 x$) and the third ($\nabla^3 x$) order differenced series - see plot below. Decide based on the plot the likely minimum order of differencing necessary to obtain a stationary time series. Briefly justify your answer. [2]
[Figure: four time series plots against Time (t = 0 to 200) - "Original time series" (x), "First order differences" ($\nabla x$), "Second order differences" ($\nabla^2 x$) and "Third order differences" ($\nabla^3 x$).]
(b) The statistician decides to fit an ARIMA(0, 1, 2) model using R. The following output is produced:
Call:
arima(x = temp, order = c(0, 1, 2))
Coefficients:
ma1 ma2
0.5074 0.8856
s.e. 0.0351 0.0367
sigma^2 estimated as 1.143: log likelihood = -298.72, aic = 603.43
The statistician wants to use this model for forecasting. Write down, without using any backwards operators, an explicit difference equation for $X_{t+L}$, where $L = 1, 2, 3, \ldots$. [2]
(c) Recall that the minimum mean square error forecast $\hat{x}_t(L)$ for $X_{t+L}$, based at origin $t$, equals the conditional mean of $X_{t+L}$ given the information available at time $t$. Hence write down, without proof, a set of difference equations giving (for $L = 1, 2, 3, \ldots$) the forecast $\hat{x}_t(L)$ of $X_{t+L}$. [3]
(d) Let the forecast origin be $t = 200$. The observed values for the temperature changes $x_t$, for $t = 198, 199, 200$, together with the corresponding estimated random shocks $\hat{\varepsilon}_t$, were as follows:

t      $x_t$      $\hat{\varepsilon}_t$
198    -22.24      0.036
199    -24.54     -0.600
200    -25.03     -0.220

Calculate the minimum mean square error forecast of the change in temperature for $t = 202$. [4]
(e) Consider the ARIMA(0,1,2) model $(1 - B)X_t = (1 + 0.5B + 0.9B^2)\varepsilon_t$. Express the $(X_t)_{t \in \mathbb{Z}}$ process in random shock form. [4]
5. A (G)ARCH model was fitted to the first order differenced series ($\nabla x$) of the observations on the changes in temperature, as described in Question 4 above.
Coefficient(s):
Estimate Std. Error t value Pr(>|t|)
a0 1.723e+00 3.335e+00 0.517 0.605
a1 1.519e-01 1.061e-01 1.432 0.152
a2 4.519e-02 2.806e-01 0.161 0.872
b1 9.293e-15 1.840e+00 0.000 1.000
Diagnostic Tests:
Jarque Bera Test
data: Residuals
X-squared = 1.8717, df = 2, p-value = 0.3923
Box-Ljung test
data: Squared.Residuals
X-squared = 0.065631, df = 1, p-value = 0.7978
Answer the following questions:
(a) Which type of (G)ARCH model has been fitted? [1]
(b) Write down the volatility equation of the fitted model. [1]
(c) Considering the fitted model, is the variance of the white noise process
finite? If so, compute an estimate for this variance. [2]
(d) Given the provided R output, comment on whether the fitted model is appropriate for the ($\nabla x$) data. [3]
Formulae Sheet
General solutions of first and second order homogeneous recurrence relations
• First order homogeneous recurrence relation
$$u_{k+1} - \lambda u_k = 0$$
for $k = 0, 1, \ldots$.
Solution:
$$u_k = A\lambda^k, \text{ where } A \text{ is a constant}$$
• Second order homogeneous recurrence relation
$$u_{k+2} + c_1 u_{k+1} + c_2 u_k = 0$$
for $k = 0, 1, \ldots$ with auxiliary equation:
$$x^2 + c_1 x + c_2 = 0$$
Solution
– when the roots of the auxiliary equation, $\lambda_1$ and $\lambda_2$, are distinct:
$$u_k = A_1\lambda_1^k + A_2\lambda_2^k, \text{ where } A_1, A_2 \text{ are constants}$$
– when the auxiliary equation has a double root $\lambda$:
$$u_k = (A_1 + kA_2)\lambda^k, \text{ where } A_1, A_2 \text{ are constants}$$
MT4527: Time Series - Solutions
1. (a) The state space of $X_t$ is the set of values that the random variables $X_t$ may take. [1]
[EASY - bookwork]
(b) (i) For a random walk $X_t$, without drift and with initial value $X_0 = 0$,
$$X_t = \varepsilon_t + \varepsilon_{t-1} + \ldots + \varepsilon_1$$
with $\varepsilon$ a white noise. Then,
$$E(X_t) = E(\varepsilon_t) + E(\varepsilon_{t-1}) + \ldots + E(\varepsilon_1) = 0 + 0 + \ldots + 0 = 0$$
[1]
(ii) For the variance of $X_t$,
$$Var(X_t) = E(X_t^2) = E(\varepsilon_t^2) + E(\varepsilon_{t-1}^2) + \ldots + E(\varepsilon_1^2) + 0 = t\sigma_\varepsilon^2$$
because $E(\varepsilon_t\varepsilon_s) = 0$ for $t \neq s$. [2]
(iii) Using the results above,
$$Cov(X_t, X_{t+h}) = E(X_t X_{t+h}) = E(\varepsilon_t^2) + E(\varepsilon_{t-1}^2) + \ldots + E(\varepsilon_1^2) + 0 = t\sigma_\varepsilon^2$$
Also, $Var(X_{t+h}) = (t+h)\sigma_\varepsilon^2$. Then the correlation $Corr(X_t, X_{t+h})$ is
$$Corr(X_t, X_{t+h}) = \frac{Cov(X_t, X_{t+h})}{\sqrt{Var(X_t)\,Var(X_{t+h})}} = \frac{t}{\sqrt{t(t+h)}} = \sqrt{\frac{t}{t+h}}$$
[3]
[MEDIUM - Seen in tutorial]
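As an illustrative check (not part of the model answer), the formulas above can be verified by simulation in R; the seed, the number of replications and the values of t and h below are arbitrary choices.

# Monte Carlo check of Var(X_t) = t*sigma^2 and Corr(X_t, X_{t+h}) = sqrt(t/(t+h)),
# here with sigma = 1, t = 50, h = 30 and 5000 simulated random-walk paths.
set.seed(1)
t0 <- 50; h <- 30; nrep <- 5000
paths <- replicate(nrep, cumsum(rnorm(t0 + h)))  # each column is one random-walk path
var(paths[t0, ])                                 # should be close to t0 = 50
cor(paths[t0, ], paths[t0 + h, ])                # should be close to sqrt(50/80) = 0.79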
(c) The choice of the span q is subject to the trade-off between a faster reaction to trend changes and the variance of the trend estimator. [2]
[EASY - bookwork]
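A short illustrative sketch of this trade-off (the simulated series, the spans q = 3 and q = 15, and the helper function sma are hypothetical choices, not part of the model answer): a larger span gives a smoother, lower-variance trend estimate, but it reacts more slowly to changes in the trend.

# Centred simple moving averages of span q applied to a simulated trend-plus-noise series.
set.seed(2)
x <- ts(0.05 * (1:200) + rnorm(200))                         # hypothetical series with a linear trend
sma <- function(x, q) stats::filter(x, rep(1/(2*q + 1), 2*q + 1))
trend_q3  <- sma(x, 3)    # small span: follows trend changes quickly but is noisy
trend_q15 <- sma(x, 15)   # large span: smooth (low variance) but slow to react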
2.
$$E(X_t X_{t-k}) = \phi_1 E(X_{t-1}X_{t-k}) + \ldots + \phi_p E(X_{t-p}X_{t-k}) + E(\varepsilon_t X_{t-k}).$$
In the stationarity scenario, all these random variables have mean zero. Also, $X_{t-k}$ is not correlated with $\varepsilon_t$, as it is a function of $\{\varepsilon_{t-k}, \varepsilon_{t-k-1}, \ldots\}$. So, we obtain,
$$\gamma_k = \phi_1\gamma_{k-1} + \ldots + \phi_p\gamma_{k-p},$$
and dividing by $\gamma_0$,
$$\rho_k = \phi_1\rho_{k-1} + \ldots + \phi_p\rho_{k-p}, \qquad k = 1, 2, 3, \ldots$$
The initial conditions are expressed as $\rho_0 = 1$ and $\rho_{-k} = \rho_k$ for $k = 1, 2, \ldots, p-1$. [3]
[EASY to MEDIUM - bookwork]
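As an optional numerical check (not required in the exam), the Yule-Walker recursion can be verified in R for a specific AR(2); here the process of Question 3, with $\phi_1 = -1/3$ and $\phi_2 = 2/9$, is used as an example.

# Theoretical ACF of X_t = -(1/3) X_{t-1} + (2/9) X_{t-2} + eps_t, via ARMAacf().
phi <- c(-1/3, 2/9)
rho <- as.numeric(ARMAacf(ar = phi, lag.max = 5))   # rho[1] = rho_0, rho[2] = rho_1, ...
# Yule-Walker: rho_k = phi_1 rho_{k-1} + phi_2 rho_{k-2}; check it for k = 2
c(rho[3], phi[1] * rho[2] + phi[2] * rho[1])        # the two numbers should agree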
3. (a) The roots of the equation
$$\phi(x) = 1 + \frac{1}{3}x - \frac{2}{9}x^2 = 0 \iff 2x^2 - 3x - 9 = 0$$
are $x_1 = 3$ and $x_2 = -3/2$. Both lie outside the unit circle so the AR process is indeed stationary.
[EASY to MEDIUM - seen similar] [1]
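The root calculation can be checked numerically in R (an optional check, not part of the model answer):

# Moduli of the roots of phi(x) = 1 + (1/3) x - (2/9) x^2; both should exceed 1.
Mod(polyroot(c(1, 1/3, -2/9)))   # moduli 1.5 and 3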
(b) If we substitute $X_t = \sum_{i=0}^{\infty}\theta_i\varepsilon_{t-i}$ into the defining equation of the process, we get
$$\sum_{i=0}^{\infty}\theta_i\varepsilon_{t-i} + \frac{1}{3}\sum_{i=0}^{\infty}\theta_i\varepsilon_{t-1-i} - \frac{2}{9}\sum_{i=0}^{\infty}\theta_i\varepsilon_{t-2-i} = \varepsilon_t$$
$$\theta_0\varepsilon_t + \left(\theta_1 + \frac{1}{3}\theta_0\right)\varepsilon_{t-1} + \sum_{i=0}^{\infty}\left(\theta_{i+2} + \frac{1}{3}\theta_{i+1} - \frac{2}{9}\theta_i\right)\varepsilon_{t-2-i} = \varepsilon_t$$
Comparing the coefficients of $\varepsilon_{t-i}$ we get
($\varepsilon_t$): $\theta_0 = 1$
($\varepsilon_{t-1}$): $\theta_1 + \frac{1}{3}\theta_0 = 0 \Rightarrow \theta_1 = -\frac{1}{3}$
($\varepsilon_{t-i-2}$): $\theta_{i+2} + \frac{1}{3}\theta_{i+1} - \frac{2}{9}\theta_i = 0, \quad i = 0, 1, 2, \ldots$
[MEDIUM - seen similar]
[4]
(c) The auxiliary equation of this second order homogeneous recurrence relation is:
$$x^2 + \frac{1}{3}x - \frac{2}{9} = 0,$$
the roots of which are the reciprocals of the roots of $\phi(x) = 0$, namely $x = 1/3$ or $-2/3$.
Hence the general solution of the recurrence relation is
$$\theta_i = A_1\left(\frac{1}{3}\right)^i + A_2\left(-\frac{2}{3}\right)^i, \qquad i = 0, 1, 2, \ldots$$
Now $\theta_0 = 1 \Rightarrow 1 = A_1 + A_2$
$\theta_1 = -\frac{1}{3} \Rightarrow -\frac{1}{3} = \frac{1}{3}A_1 - \frac{2}{3}A_2 \Rightarrow A_1 = 1/3,\ A_2 = 2/3$.
So the infinite MA representation has coefficients
$$\theta_i = \frac{1}{3}\left(\frac{1}{3}\right)^i + \frac{2}{3}\left(-\frac{2}{3}\right)^i, \qquad i = 0, 1, 2, \ldots$$
[4]
[MEDIUM - seen similar]
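An optional R check of these coefficients against the psi-weights computed by ARMAtoMA(); note that ARMAtoMA() expects the AR coefficients written on the right-hand side, i.e. $X_t = -\frac{1}{3}X_{t-1} + \frac{2}{9}X_{t-2} + \varepsilon_t$, so ar = c(-1/3, 2/9).

# Closed-form theta_i versus the psi-weights from ARMAtoMA(), for i = 1, ..., 6.
i <- 1:6
closed_form <- (1/3) * (1/3)^i + (2/3) * (-2/3)^i
rbind(closed_form, ARMAtoMA = ARMAtoMA(ar = c(-1/3, 2/9), lag.max = 6))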
(d) Taking conditional expectations at time $t$ through the given equation, we have
$$E_t(X_{t+L}) = \nu_0 E_t(\varepsilon_{t+L}) + \nu_1 E_t(\varepsilon_{t+L-1}) + \ldots + \nu_{L-1}E_t(\varepsilon_{t+1}) + E_t[\hat{x}_t(L)] = \hat{x}_t(L),$$
since the conditional means of future shocks are their unconditional means, namely zero, and the forecast $\hat{x}_t(L)$ is a known constant at time $t$.
Similarly, taking conditional variances at time $t$ through the given equation, we have
$$Var_t(X_{t+L}) = Var_t(\nu_0\varepsilon_{t+L}) + Var_t(\nu_1\varepsilon_{t+L-1}) + \ldots + Var_t(\nu_{L-1}\varepsilon_{t+1}) + Var_t[\hat{x}_t(L)] = \nu_0^2\sigma_\varepsilon^2 + \nu_1^2\sigma_\varepsilon^2 + \ldots + \nu_{L-1}^2\sigma_\varepsilon^2.$$
Hence the result. [2]
[4]
[MEDIUM - seen similar in lecture]
(e) Using this result, the variance of $X_{82}$, conditional on the information available at $t = 80$, is $Var_t(X_{82}) = \sigma_\varepsilon^2(\nu_0^2 + \nu_1^2) = \sigma_\varepsilon^2\left\{1 + \left(\frac{1}{3}\right)^2\right\} = \frac{10}{9}\sigma_\varepsilon^2$.
Approximating the t distribution by N(0, 1), the 95% prediction interval for $X_{82}$, based at origin $t = 80$, has end-points $60 \pm (1.96)\,s_\varepsilon\sqrt{\frac{10}{9}} = 60 \pm 18.59$, i.e. 41.4 and 78.6. [3]
[HARD - students will have to remember the interpretation of the ⌫ con-
stants and show initiative]
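The interval end-points can be reproduced in R:

# 95% prediction interval for X_82 at origin t = 80, with s_eps = sqrt(81) = 9.
60 + c(-1, 1) * qnorm(0.975) * 9 * sqrt(10/9)   # approximately 41.4 and 78.6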
4. (a) The original series has no fixed level, which is an indication of non-stationarity. The first order differenced time series appears stationary.
[2]
[EASY - seen similar]
(b) The difference equation of the ARIMA(0,1,2) is,
$$(1 - B)X_{t+L} = \varepsilon_{t+L} + 0.5074\varepsilon_{t+L-1} + 0.8856\varepsilon_{t+L-2} \Rightarrow$$
$$X_{t+L} - X_{t+L-1} = \varepsilon_{t+L} + 0.5074\varepsilon_{t+L-1} + 0.8856\varepsilon_{t+L-2}$$
[2]
[EASY to MEDIUM - application of bookwork]
(c) Taking conditional expectations at time $t$ yields,
when $L = 1$: $\hat{x}_t(1) = x_t + 0.5074\hat{\varepsilon}_t + 0.8856\hat{\varepsilon}_{t-1}$,
when $L = 2$: $\hat{x}_t(2) = \hat{x}_t(1) + 0.8856\hat{\varepsilon}_t$,
when $L \geq 3$: $\hat{x}_t(L) = \hat{x}_t(L-1)$
[3]
[MEDIUM - seen similar examples but not of this ARIMA model]
(d) The forecast for $t = 201$ is,
$$\hat{x}_{200}(1) = x_{200} + 0.5074\hat{\varepsilon}_{200} + 0.8856\hat{\varepsilon}_{199} = -25.03 + 0.5074(-0.22) + 0.8856(-0.6) = -25.67$$
Using this result, the forecast for $t = 202$ is,
$$\hat{x}_{200}(2) = \hat{x}_{200}(1) + 0.8856\hat{\varepsilon}_{200} = -25.67 + 0.8856(-0.22) = -25.86$$
[HARD - have seen a simpler example] [4]
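For completeness, the same two forecasts computed in R directly from the difference equations in part (c):

# Minimum MSE forecasts at origin t = 200 for t = 201 and t = 202.
x200 <- -25.03; e200 <- -0.220; e199 <- -0.600
xhat1 <- x200 + 0.5074 * e200 + 0.8856 * e199   # approximately -25.67
xhat2 <- xhat1 + 0.8856 * e200                  # approximately -25.87
c(xhat1, xhat2)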
(e)
$$X_t = (1-B)^{-1}(1 + 0.5B + 0.9B^2)\varepsilon_t = (1 + B + B^2 + \ldots)(1 + 0.5B + 0.9B^2)\varepsilon_t$$
$$= [(1 + B + B^2 + B^3 + \ldots) + (0.5B + 0.5B^2 + 0.5B^3 + \ldots) + (0.9B^2 + 0.9B^3 + \ldots)]\varepsilon_t$$
$$= (1 + 1.5B + 2.4B^2 + 2.4B^3 + \ldots)\varepsilon_t$$
$$= \varepsilon_t + 1.5\varepsilon_{t-1} + 2.4\varepsilon_{t-2} + 2.4\varepsilon_{t-3} + \ldots$$
[4]
[MEDIUM - seen slightly simpler examples]
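An optional numerical check: multiplying by $(1-B)^{-1} = 1 + B + B^2 + \ldots$ amounts to taking cumulative sums of the MA coefficients, so the random-shock weights can be obtained in R as follows.

# psi-weights of (1 - B)^{-1} (1 + 0.5 B + 0.9 B^2), padding with zeros beyond lag 2.
cumsum(c(1, 0.5, 0.9, rep(0, 4)))   # 1.0 1.5 2.4 2.4 2.4 2.4 2.4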
5. (a) GARCH(1,2). [1]
[EASY - seen similar]
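A sketch of how output of the form shown in the question might be produced with the tseries package, assuming the 200 temperature changes are stored in a vector x (the object names dx and fit are hypothetical):

# Fit a GARCH(1,2) to the first-order differences and print the diagnostics.
library(tseries)
dx  <- diff(x)                     # first-order differenced series (nabla x)
fit <- garch(dx, order = c(1, 2))  # order = c(GARCH order, ARCH order)
summary(fit)                       # coefficient table, Jarque-Bera and Box-Ljung tests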
(b) The volatility equation is
$$\sigma_t^2 = 1.723 + 0.1519\varepsilon_{t-1}^2 + 0.045\varepsilon_{t-2}^2 + 9.3 \times 10^{-15}\sigma_{t-1}^2$$
[1]
[EASY - application of bookwork]
(c) The variance of a GARCH(1,2) model is finite if $\alpha_1 + \alpha_2 + \beta_1 < 1$. Here $\hat{\alpha}_1 + \hat{\alpha}_2 + \hat{\beta}_1 \approx 0.197 < 1$. Consequently the variance is finite. An estimate is,
$$\frac{\hat{\alpha}_0}{1 - \hat{\alpha}_1 - \hat{\alpha}_2 - \hat{\beta}_1} = \frac{1.723}{1 - 0.19709} = 2.1459$$
[2]
[MEDIUM - seen similar]
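The estimate can be reproduced directly from the reported coefficients:

# Estimated unconditional variance a0 / (1 - a1 - a2 - b1).
a0 <- 1.723; a1 <- 0.1519; a2 <- 0.04519; b1 <- 9.293e-15
a0 / (1 - a1 - a2 - b1)   # approximately 2.146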
(d) The p-value of the Ljung-Box test is greater than any standard significance level, so the null hypothesis of uncorrelated squared residuals is not rejected. Also, the null hypothesis of normal residuals (Jarque-Bera test) is not rejected at any standard significance level. Consequently the normality assumption of the innovations could be appropriate. However, the tests on the significance of the model coefficients (each with null hypothesis that the coefficient is zero) all show very large p-values. This is an indication that there is no GARCH signal in the data. One should be cautious though, as each test is conducted assuming that all other terms are present in the model. So, some coefficients may be significantly different from zero in the absence of other coefficients. [3]
[HARD - seen similar for the residuals assumptions but not for the model coefficients]
END OF PAPER