
Worked-assignment sample: MATH380201

Date: 2021-04-17

Module Code: MATH380201

1. (a) Consider the time series model

Xt = µ(t) + s(t) + εt.

Assuming the standard notation used in this module, what do each of the terms Xt, µ(t), s(t) and εt represent?

In a plot of Xt against t, what features would you look for to determine whether the terms µ(t) and s(t) are required?

Explain why µ(t) and s(t) are functions of t, whilst t is a subscript in Xt and εt.
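The additive decomposition in part (a) can be made concrete by simulating a series that contains all three components. The sketch below is illustrative only: the linear trend, the quarterly seasonal effects and the noise level are arbitrary choices, not values taken from this paper.

```python
import random

random.seed(1)

n = 32  # eight years of quarterly data
season = [10.0, -5.0, -12.0, 7.0]  # one fixed effect per quarter, summing to 0

# X_t = mu(t) + s(t) + eps_t: deterministic trend, periodic seasonal, random error
x = [100.0 + 2.0 * t            # mu(t): slowly varying trend
     + season[t % 4]            # s(t): repeats with period 4
     + random.gauss(0.0, 3.0)   # eps_t: irregular component
     for t in range(n)]

# In a plot of x against t you would look for a systematic drift (evidence that
# mu is needed) and a repeating within-year pattern (evidence that s is needed).
yearly_means = [sum(x[i:i + 4]) / 4 for i in range(0, n, 4)]
```

Averaging over each year removes the seasonal effect, so a drift in `yearly_means` isolates the trend component.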

(b) Quarterly sales of videos in the Leeds “Disney” store are shown in figure 1. Below is the code and output for an analysis of these data in R, with the sales data stored in the time series object X.

Explain what is being done at points (i)–(iv) in the R code. Explain the difference between (v) and (vi) in the R code. Explain, giving reasons, which of (v) and (vi) is preferable. Write out the model with estimated parameters in full. (The relevant points in the R code are denoted #### (i) #### etc.)

Given that the sales for the four quarters of 2018 were 721, 935, 649, and 1071, use model-based forecasting to predict sales for the first quarter of 2019. (A point forecast is sufficient; you do not need to calculate a prediction interval.)

Suggest one change to the fitted model which would improve the analysis. (You can assume that the choice of stochastic process at (v) in the R code is the correct one for these data.)

[Figure 1 here: plot of quarterly Sales (approximately 1000 to 2000) against Time, 2010 to 2018.]

Figure 1: Quarterly video sales in Leeds “Disney” store.


> tt = 1:32
> trend.lm = lm(sales ~ tt)                                #### (i) ####
> summary(trend.lm)

Coefficients:
             Estimate Std. Error t value Pr(>|t|)
(Intercept) 2107.220      57.997   36.33  < 2e-16 ***
tt           -43.500       3.067  -14.18 7.72e-15 ***

> trend = ts(fitted(trend.lm), start=start(sales), freq=frequency(sales))
> X = sales - trend                                        #### (ii) ####
> q1 = as.numeric((1:32 %% 4) == 1)
> q2 = as.numeric((1:32 %% 4) == 2)
> q3 = as.numeric((1:32 %% 4) == 3)
> q4 = as.numeric((1:32 %% 4) == 0)
> season.lm = lm(resid(trend.lm) ~ 0 + q1 + q2 + q3 + q4)  #### (iii) ####
> summary(season.lm)

Coefficients:
   Estimate Std. Error t value Pr(>|t|)
q1   -38.41      43.27  -0.888  0.38232
q2    18.80      43.27   0.435  0.66719
q3  -134.78      43.27  -3.115  0.00422 **
q4   154.38      43.27   3.568  0.00132 **

> season = ts(fitted(season.lm), start=start(sales), freq=frequency(sales))
> Y = X - season                                           #### (iv) ####
> ar(Y, aic=FALSE, order.max=1)                            #### (v) ####

Coefficients:
     1
0.5704

Order selected 1  sigma^2 estimated as  9431

> ar(Y, aic=FALSE, order.max=2)                            #### (vi) ####

Coefficients:
     1      2
0.5574 0.0105

Order selected 2  sigma^2 estimated as  9437
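For the forecasting part of question 1(b), the estimates printed above can be reused directly. The Python sketch below shows the model-based point forecast, under the assumption that tt = 1 corresponds to 2010 Q1 (so the 2018 quarters are t = 33, …, 36 and 2019 Q1 is t = 37) and using the AR(1) fit at (v).

```python
# Estimates read off the R output above
intercept, slope = 2107.220, -43.500                   # trend: mu(t) = intercept + slope * t
season = {1: -38.41, 2: 18.80, 3: -134.78, 0: 154.38}  # keyed by t %% 4, as in the R code
alpha = 0.5704                                         # AR(1) coefficient from (v)

def detrended(t, value):
    """Remove the fitted trend and seasonal effect from an observed sale."""
    return value - (intercept + slope * t) - season[t % 4]

# For an AR(1), only the most recent residual matters: 2018 Q4 is t = 36
y36 = detrended(36, 1071)

# One-step forecast: trend + seasonal at t = 37, plus alpha times the last residual
t = 37
forecast = (intercept + slope * t) + season[t % 4] + alpha * y36
```

With these numbers the last residual is about 375.4 and the point forecast for 2019 Q1 is about 673.4; the exact figures depend on the t-index convention assumed above.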


2. Let {Xt} be a moving average process of order q (usually written as MA(q)) defined on t ∈ Z as

Xt = εt + β1εt−1 + · · · + βqεt−q,    (1)

where {εt} is a white noise process with variance 1.

(a) Show that for any MA(1) process with β1 ≠ 1 there exists another MA(1) process with the same autocorrelation function, and find the lag 1 moving average coefficient (β′1 say) of this process.
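The key fact behind part (a) is that an MA(1) process has lag-1 autocorrelation ρ1 = β1/(1 + β1²), and this expression is unchanged when β1 is replaced by its reciprocal. A quick numerical check (β1 = 0.5 is an arbitrary choice):

```python
def ma1_rho1(beta):
    """Lag-1 autocorrelation of X_t = eps_t + beta * eps_{t-1} with unit-variance noise."""
    return beta / (1.0 + beta * beta)

beta = 0.5
rho_original = ma1_rho1(beta)          # 0.5 / 1.25 = 0.4
rho_reciprocal = ma1_rho1(1.0 / beta)  # the candidate beta' = 1/beta gives 2 / 5 = 0.4
```

Both coefficients produce the same autocorrelation function (lags beyond 1 are zero for any MA(1)), which is the non-uniqueness the question asks you to establish.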

(b) For an MA(2) process, equation (1) becomes

Xt = εt + β1εt−1 + β2εt−2.    (2)

i. Define the backshift operator B, and write equation (2) in terms of a polynomial function β(B), giving a clear definition of this function.

ii. Hence show that equation (2) can be written as an infinite order autoregressive process under certain conditions on β(B), clearly stating these conditions.
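The inversion in part (b)ii can be checked numerically: writing β(B) = 1 + β1B + β2B², the AR(∞) weights πj satisfy π0 = 1 and πj = −β1πj−1 − β2πj−2, provided the roots of β(z) lie outside the unit circle (the invertibility condition). A sketch with arbitrarily chosen invertible coefficients:

```python
beta1, beta2 = 0.4, 0.2  # arbitrary; the roots of 1 + 0.4z + 0.2z^2 lie outside |z| = 1

# Coefficients of pi(z) = 1/beta(z), generated by pi_j = -beta1*pi_{j-1} - beta2*pi_{j-2}
m = 30
pi = [1.0]
for j in range(1, m):
    prev1 = pi[j - 1]
    prev2 = pi[j - 2] if j >= 2 else 0.0
    pi.append(-beta1 * prev1 - beta2 * prev2)

# Sanity check: the product beta(z) * pi(z) should be 1 + 0z + 0z^2 + ...
def conv_coeff(k):
    b = [1.0, beta1, beta2]
    return sum(b[i] * pi[k - i] for i in range(3) if 0 <= k - i < m)
```

The πj decay geometrically exactly because the roots of β(z) are outside the unit circle; with a non-invertible choice the recursion would diverge and no AR(∞) representation would exist.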


3. Let {Xt} be an autoregressive process of order one, usually written as AR(1).

(a) Write down an equation defining Xt in terms of an autoregression coefficient α and a white noise process {εt} with variance σ²ε.

Explain what the phrase “{εt} is a white noise process with variance σ²ε” means.

(b) Derive expressions for the variance γ0 and the autocorrelation function ρk, k = 0, 1, . . . of {Xt} in terms of σ²ε and α.

Use these expressions to suggest an estimate of α in terms of the sample autocorrelations {ρ̂k}.

(c) Suppose that only every second value of Xt is observed, resulting in a time series Yt = X2t, t = 1, 2, . . ..

Show that {Yt} forms an AR(1) process. Find its autoregression coefficient, say α′, and the variance of the underlying white noise process, in terms of α and σ²ε.

(d) Given a time series data set X1, . . . , X256 with sample mean x̄ = 9.23 and sample autocorrelations ρ̂1 = −0.6, ρ̂2 = 0.36, ρ̂3 = −0.22, ρ̂4 = 0.13, ρ̂5 = −0.08, estimate the autoregression coefficients α and α′ of {Xt} and {Yt}.
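Parts (b)–(d) fit together numerically: for an AR(1), ρk = α^k, so α is naturally estimated by ρ̂1, and the subsampled series Yt = X2t is AR(1) with coefficient α′ = α², whose lag-1 autocorrelation is ρ2 of the original series. A sketch of the part (d) computation using the printed sample autocorrelations:

```python
rho_hat = {1: -0.6, 2: 0.36, 3: -0.22, 4: 0.13, 5: -0.08}

alpha_hat = rho_hat[1]        # rho_k = alpha^k, so the lag-1 value estimates alpha
alpha_prime_hat = rho_hat[2]  # Y_t = X_{2t} has lag-1 autocorrelation rho_2 = alpha^2

# Internal consistency: rho_hat[2] should match alpha_hat**2, and the remaining
# lags should roughly follow the geometric pattern (-0.6)^k.
consistency_gap = abs(alpha_hat ** 2 - rho_hat[2])
```

Here the data cooperate exactly: ρ̂2 = 0.36 = (−0.6)², so both routes to α′ agree.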


4. (a) Given data X1, . . . , Xn, let Xn(l) be the l-step ahead forecast of Xn+l based on a (possibly infinite) MA model for {Xt}. Show that minimising

E{[Xn+l − Xn(l)]²}

leads to a forecast of the form

Xn(l) = ∑_{j=l}^{∞} βj εn+l−j,

where {εt} is a white noise process, stating how the βj might be found.
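The forecast formula in part (a) simply zeroes out the future innovations: the terms with j < l involve ε's dated after time n, whose best squared-error prediction is their mean, 0. A minimal sketch computing Xn(l) from known (here truncated and arbitrarily chosen) weights βj and past innovations:

```python
def ma_forecast(beta, eps, l):
    """l-step forecast X_n(l) = sum over j >= l of beta_j * eps_{n+l-j}.

    beta: weights beta_0, beta_1, ... (truncated to a finite list);
    eps: innovations eps_1, ..., eps_n in time order.
    Innovations dated after time n contribute 0 and are skipped.
    """
    n = len(eps)
    total = 0.0
    for j in range(l, len(beta)):
        idx = n + l - j  # time index of the innovation multiplying beta_j
        if 1 <= idx <= n:
            total += beta[j] * eps[idx - 1]
    return total

# Example: beta_j = 0.5**j (geometrically decaying weights, truncated), n = 4
beta = [0.5 ** j for j in range(20)]
eps = [1.0, -2.0, 0.5, 1.5]
f1 = ma_forecast(beta, eps, 1)  # one-step-ahead forecast
```

As l grows past the available history the sum empties out and the forecast reverts to the process mean of 0, matching the behaviour the derivation predicts.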

(b) i. Define the difference operator ∇, and write ∇ in terms of the backshift operator B.

ii. Define the autoregressive moving average process of order (p, q) (usually written as ARMA(p, q)) and the autoregressive integrated moving average process of order (p, d, q) (usually written as ARIMA(p, d, q)), and show how an ARIMA(p, d, q) process can be written as an ARMA(p′, q′) process, giving the AR and MA orders of this process.

iii. Hourly data were gathered on the bacterial infection of a sample of food under controlled conditions. These data are denoted by Xt, t = 1, . . . , 478.

Summary statistics for {Xt} and {Yt}, where Yt = ∇Xt, are presented below. Here, ρ̂k and α̂kk are respectively the sample autocorrelation and sample partial autocorrelation coefficients at lag k.

Identify a suitable model for this time series and estimate the parameters of your model.

Xt : x̄ = 443.7, sx = 341.3.

k       1      2      3      4      5      6      7      8      9     10
ρ̂k   0.995  0.991  0.986  0.981  0.976  0.970  0.965  0.959  0.952  0.946
α̂kk  0.995  0.005 −0.022 −0.018 −0.024 −0.031 −0.013 −0.042 −0.027 −0.007

Yt : ȳ = 1.830, sy = 17.17.

k        1      2      3      4      5      6      7      8      9     10
ρ̂k   −0.019  0.027  0.076  0.058 −0.022 −0.032  0.045 −0.047 −0.014  0.007
α̂kk  −0.019  0.026  0.077  0.060 −0.024 −0.043  0.036 −0.043 −0.010  0.007
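A rough numerical version of the identification argument in part iii: the ρ̂k for Xt barely decay (all near 1), pointing to differencing, while every ρ̂k for Yt = ∇Xt lies inside the approximate 95% whiteness band ±2/√477. On that reading Yt looks like white noise about its mean, i.e. Xt behaves like a random walk with drift, an ARIMA(0,1,0) model with drift estimated by ȳ = 1.830.

```python
import math

rho_x = [0.995, 0.991, 0.986, 0.981, 0.976, 0.970, 0.965, 0.959, 0.952, 0.946]
rho_y = [-0.019, 0.027, 0.076, 0.058, -0.022, -0.032, 0.045, -0.047, -0.014, 0.007]

n_y = 477                    # differencing X_1, ..., X_478 leaves 477 values of Y
band = 2.0 / math.sqrt(n_y)  # approximate 95% limits for a white noise ACF

x_nonstationary = all(r > 0.9 for r in rho_x)  # ACF of X barely decays
y_white = all(abs(r) < band for r in rho_y)    # no significant correlation in Y
```

The band works out to about ±0.092, and the largest |ρ̂k| for Yt is 0.076, so no lag of the differenced series is individually significant at this approximate level.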

End.
