HW2 Solutions
Problem 1
We will answer part (a) and part (b) together. From Example 1.18, we know that the population autocorrelation function of white noise is

ρ(h) = 1 for h = 0, and ρ(h) = 0 for h ≠ 0.

We simulate the cases n = 500 and n = 50 as follows.
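For reference, the sample quantities computed by R's acf() are the usual moment estimators of the autocovariance and autocorrelation:

```latex
\hat\gamma(h) = \frac{1}{n} \sum_{t=1}^{n-h} (x_{t+h} - \bar{x})(x_t - \bar{x}),
\qquad
\hat\rho(h) = \frac{\hat\gamma(h)}{\hat\gamma(0)} .
```

It is these ρ̂(h) values, for h = 0, …, 20, that we compare against the population ρ(h) below.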
library(ggplot2)
#population ACF
LAG = c(0:20) # lag up to 20
acf.p = c(1, rep(0,20)) # population acf (as above)
ACF.df.p = data.frame(lag = LAG, acf = acf.p) # turn the above into data frame
# n=500 gaussian white noise ACF
w = rnorm(500,0,1) # n=500 gaussian white noise
ACF.500 = acf(w, lag.max = 20, plot = FALSE)
ACF.df.500 = with(ACF.500, data.frame(lag, acf))
# n=50 gaussian white noise ACF
w = rnorm(50,0,1) # n=50 gaussian white noise
ACF.50 = acf(w, lag.max = 20, plot = FALSE)
ACF.df.50 = with(ACF.50, data.frame(lag, acf))
#combine the acfs (population, n=500, n=50)
dat = data.frame(rbind(ACF.df.500, ACF.df.50, ACF.df.p),
n = c(rep("n=500", 21), rep("n=50", 21), rep("population", 21)))
# comparison of the three ACFs
ggplot(data = dat, mapping = aes(x = lag, y = acf)) +
geom_hline(aes(yintercept = 0)) +
facet_grid(n ~ . )+
geom_segment(mapping = aes(xend = lag, yend = 0))
[Figure: ACF bar plots (lag 0–20) for the panels n=50, n=500, and population; y-axis "acf".]
As the comparison above shows, when the sample size increases from n = 50 to n = 500, the sample autocorrelation function gets closer to its population counterpart, i.e.,

ρ̂n(h) → ρ(h) as n → ∞.
Problem 2
We answer part (a) and part (b) together. Assume that (wt : t ∈ N) is white noise with mean zero and variance σ²w, and consider replacing wt by an average of its current value and its immediate neighbors in the past and future. That is, let vt = (wt+1 + wt + wt−1)/3. We simulate samples of size n = 50 and n = 500 from vt and compare the sample autocorrelation function ρ̂n(h) with its population counterpart ρ(h). It is straightforward to show that the population autocorrelation function is

ρv(h) = 1 for h = 0, 2/3 for |h| = 1, 1/3 for |h| = 2, and 0 for |h| > 2.
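For completeness, these values follow from counting how many white-noise terms the two averages share:

```latex
\gamma_v(h) = \operatorname{Cov}(v_{t+h}, v_t)
= \frac{1}{9}\operatorname{Cov}\!\Big(\sum_{i=-1}^{1} w_{t+h+i},\ \sum_{j=-1}^{1} w_{t+j}\Big)
= \frac{3-|h|}{9}\,\sigma_w^2 \quad\text{for } |h| \le 2,
```

and γv(h) = 0 for |h| > 2, so ρv(h) = γv(h)/γv(0) = (3 − |h|)/3, which gives exactly 1, 2/3, 1/3, 0.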
LAG = c(0:20) # lag up to 20
acf.p = c(1, 2/3, 1/3, rep(0,18)) # acf based on the above derivation
ACF.df.p = data.frame(lag = LAG, acf = acf.p) # turn the above into data frame
w = rnorm(502,0,1) # 502 N(0,1) variates (two extra for the endpoints)
v = stats::filter(w, sides=2, rep(1/3,3))[-c(1, 502)] # centered moving average; drop NA endpoints
ACF = acf(v, lag.max = 20, plot = FALSE)
ACF.df.500 = with(ACF, data.frame(lag, acf))
w = rnorm(52,0,1) # 52 N(0,1) variates (two extra for the endpoints)
v = stats::filter(w, sides=2, rep(1/3,3))[-c(1, 52)] # centered moving average; drop NA endpoints
ACF = acf(v, lag.max = 20, plot = FALSE)
ACF.df.50 = with(ACF, data.frame(lag, acf))
dat = data.frame(rbind(ACF.df.500, ACF.df.50, ACF.df.p),
n = c(rep("n=500", 21), rep("n=50", 21), rep("population", 21)))
# comparison of the three ACFs
ggplot(data = dat, mapping = aes(x = lag, y = acf)) +
geom_hline(aes(yintercept = 0)) +
facet_grid(n ~ . )+
geom_segment(mapping = aes(xend = lag, yend = 0))
[Figure: ACF bar plots (lag 0–20) for the panels n=50, n=500, and population; y-axis "acf".]
As the comparison above shows, as the sample size n gets larger, the sample autocorrelation function gets closer to its population counterpart, i.e.,

ρ̂n(h) → ρ(h) as n → ∞.
Problem 3
Let (wt : t ∈ Z) ∼ iid N(0, σ²) and consider the series (xt : t ∈ Z) such that

xt = 0.5wt−1 + wtwt−2.

Determine the mean and autocovariance function of xt, and determine whether it is stationary.

E[xt] = E[0.5wt−1 + wtwt−2] = 0.5E[wt−1] + E[wt]E[wt−2] = 0 + 0 × 0 = 0.

We can factor the expectation of the product because the white noise terms at different time points are independent.
γx(h) = Cov(xt, xt+h)
= E[(0.5wt−1 + wtwt−2)(0.5wt+h−1 + wt+hwt+h−2)]
= 0.25E[wt+h−1wt−1] + 0.5E[wt+h−1wtwt−2] + 0.5E[wt+hwt+h−2wt−1] + E[wtwt−2wt+hwt+h−2]
= 0.25σ² + σ⁴ for h = 0, and 0 for h ≠ 0.

Since the mean is constant and the autocovariance depends only on the lag h, xt is stationary.
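The case analysis behind the piecewise result can be made explicit: each expectation is a product of independent zero-mean factors unless the time indices pair up, which happens only at h = 0:

```latex
\begin{aligned}
E[w_{t+h-1} w_{t-1}] &= \sigma^2\,\mathbf{1}\{h = 0\},\\
E[w_{t+h-1} w_t w_{t-2}] &= 0 \quad\text{(an odd number of factors; at least one is unpaired with mean zero)},\\
E[w_{t+h} w_{t+h-2} w_{t-1}] &= 0 \quad\text{(same reason)},\\
E[w_t w_{t-2} w_{t+h} w_{t+h-2}] &= E[w_t^2]\,E[w_{t-2}^2]\,\mathbf{1}\{h = 0\} = \sigma^4\,\mathbf{1}\{h = 0\}.
\end{aligned}
```

(For the last term, pairing requires t + h = t and t + h − 2 = t − 2 simultaneously, i.e., h = 0; any other h leaves an unpaired zero-mean factor.)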
Problem 4
xt = µ + wt + 0.25wt−3 with wt ∼ iid N(0, 1), which has autocovariance function

γ(h) = 1.0625 for h = 0, 0.25 for h = ±3, and 0 otherwise.
An unbiased estimator of µ is

µ̂ = (1/n) Σ_{t=1}^{n} xt,

and its squared standard error is

se(µ̂)² = (1/n) Σ_{h=−n}^{n} (1 − |h|/n) γ(h) = (1/n)[γ(0) + 2(1 − 3/n)γ(3)] = 25/(16n) − 3/(2n²).
A (1 − α) confidence interval for µ is

µ̂ ± zα/2 se(µ̂) = (1/n) Σ_{t=1}^{n} xt ± zα/2 √(25/(16n) − 3/(2n²)).
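As a quick sanity check on the closed form (sketched in Python rather than the document's R, purely so it runs standalone; the gamma values are those of the autocovariance above):

```python
# Verify that (1/n) * sum_{h=-n}^{n} (1 - |h|/n) * gamma(h)
# equals the closed form 25/(16n) - 3/(2n^2).

def gamma(h):
    """Autocovariance of x_t = mu + w_t + 0.25 w_{t-3} with Var(w_t) = 1."""
    h = abs(h)
    if h == 0:
        return 1.0625   # 1 + 0.25^2
    if h == 3:
        return 0.25
    return 0.0

def var_mu_hat(n):
    """Direct evaluation of the weighted autocovariance sum."""
    return sum((1 - abs(h) / n) * gamma(h) for h in range(-n, n + 1)) / n

for n in (10, 50, 500):
    closed_form = 25 / (16 * n) - 3 / (2 * n**2)
    assert abs(var_mu_hat(n) - closed_form) < 1e-12
```

Only the lags h = 0 and h = ±3 contribute, which is why the sum collapses to two terms.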
Problem 5
A time series with a periodic component can be constructed from

xt = U1 sin(2πw0t) + U2 cos(2πw0t),

where U1 and U2 are independent random variables with zero means and E[U1²] = E[U2²] = σ². The constant w0 determines the period, i.e., the time it takes the process to make one complete cycle.
To show the stationarity of xt, we first check the mean function:

µt = E[xt] = E[U1 sin(2πw0t) + U2 cos(2πw0t)] = sin(2πw0t)E[U1] + cos(2πw0t)E[U2] = 0,

which is independent of time t.
Next, we compute the autocovariance function:

γx(t, t + h) = Cov(xt, xt+h)
= Cov(U1 sin(2πw0t) + U2 cos(2πw0t), U1 sin(2πw0[t + h]) + U2 cos(2πw0[t + h]))
= sin(2πw0t) sin(2πw0[t + h]) Var(U1) + cos(2πw0t) cos(2πw0[t + h]) Var(U2)
(the cross terms vanish because Cov(U1, U2) = 0)
= {sin(2πw0t) sin(2πw0[t + h]) + cos(2πw0t) cos(2πw0[t + h])} σ²
= cos(2πw0[t + h] − 2πw0t) σ²   [using cos(α ± β) = cos α cos β ∓ sin α sin β]
= cos(2πw0h) σ².

Since the autocovariance function is also independent of time t, the series xt is stationary.