MSIN0106
Exam for 2019-2020
Answer all questions. Each question carries equal weight (25 points).
Question 1
Consider the regression of Morgan Stanley daily stock prices, MSt, on the daily SP500
index, SPt:
MSt = β1 + β2SPt + ut = β′xt + ut,  t = 1, ..., T,  (1)
where xt is a 2x1 vector with elements 1 and SPt. Assume that E[xtut] = 0.
1. (6 points) Suppose you obtain OLS estimates βˆ1 = −46 and βˆ2 = 0.2, with
HAC standard errors respectively equal to 2.18 and 0.01. Discuss how/whether
you would use this regression output to test the hypothesis H0 : β2 = 0, and
mention any other analysis you would carry out to support your decision.
A: The standard errors may not be valid if the regression is spurious (which is
likely, as we are considering stock prices). You can run a unit root test on the
residuals. If the unit root is rejected, you can use the output to do a t-test:
t = 0.2/0.01 = 20 > 1.96, so reject H0.
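As an illustration, the check could be coded as follows with statsmodels; the placeholder random-walk data, the variable names ms and sp, and the HAC lag length are assumptions for the sketch, not given in the question.

    import numpy as np
    import pandas as pd
    import statsmodels.api as sm
    from statsmodels.tsa.stattools import adfuller

    # Placeholder data: in practice `ms` and `sp` would be the observed daily
    # MS and SP500 price series; here two random walks stand in so the snippet runs.
    rng = np.random.default_rng(0)
    T = 1000
    sp = pd.Series(100 * np.exp(0.01 * rng.standard_normal(T).cumsum()), name="SP")
    ms = pd.Series(50 * np.exp(0.01 * rng.standard_normal(T).cumsum()), name="MS")

    ols = sm.OLS(ms, sm.add_constant(sp)).fit(cov_type="HAC", cov_kwds={"maxlags": 5})

    # Step 1: ADF test on the residuals; a unit root signals a spurious
    # regression, in which case the t-test below is not valid.
    adf_stat, pval, *_ = adfuller(ols.resid)
    print(f"ADF on residuals: stat={adf_stat:.2f}, p-value={pval:.3f}")

    # Step 2: if the unit root is rejected, use the reported HAC output:
    # t = 0.2 / 0.01 = 20 > 1.96, so H0: beta2 = 0 is rejected.
    print("t-statistic for beta2:", 0.2 / 0.01)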
2. (6 points) Answer question 1.1 again, in light of the fact that the time series of
estimated residuals from regression (1), uˆt, looks as follows:
[Figure omitted: time-series plot of the estimated residuals uˆt.]
A: The errors look non-stationary (they are smooth and have a trend), suggesting
we are in the presence of a spurious regression, so we cannot use the output. Again,
you can run a unit root test on the residuals to confirm this.
3. (6 points) Explain how you would test whether returns on the SP500 index
Granger-cause Morgan Stanley stock returns.
A: Compute returns on both the SP500 and MS by taking log price differences from
the previous day, then regress MS returns on lags of MS returns and lags of
SP500 returns. Test whether the lags of SP500 returns are jointly significant
(e.g., with an F-test). If they are, conclude that SP500 returns Granger-cause
MS returns.
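One way this could be implemented is with statsmodels' grangercausalitytests; the placeholder data and the maximum lag of 5 are assumptions for illustration.

    import numpy as np
    import pandas as pd
    from statsmodels.tsa.stattools import grangercausalitytests

    # Placeholder price data; in practice use the observed MS and SP500 series.
    rng = np.random.default_rng(0)
    T = 1000
    sp = pd.Series(100 * np.exp(0.01 * rng.standard_normal(T).cumsum()))
    ms = pd.Series(50 * np.exp(0.01 * rng.standard_normal(T).cumsum()))

    ms_ret = np.log(ms).diff()          # MS log returns
    sp_ret = np.log(sp).diff()          # SP500 log returns

    # grangercausalitytests checks whether the SECOND column Granger-causes the
    # first, so MS returns go first and SP500 returns second; for each lag
    # length the F-test reports whether the SP500 lags are jointly significant.
    data = pd.concat([ms_ret, sp_ret], axis=1).dropna()
    grangercausalitytests(data, maxlag=5)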
4. (7 points) Discuss whether you think an Error Correction Model would be the
right model for the vector (MSt, SPt)′, based on your answer to question 1.2.
A: An ECM is the right model only if MSt and SPt are cointegrated. If in
question 1.2 we found a unit root in the residuals (as is likely), the two series
are not cointegrated, so the ECM is not the right model. If we reject a unit root,
this is evidence of cointegration and an ECM is appropriate.
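A sketch of the corresponding check, using the Engle-Granger cointegration test from statsmodels (placeholder data again; with real data, replace ms and sp with the observed price series):

    import numpy as np
    import pandas as pd
    from statsmodels.tsa.stattools import coint

    # Placeholder price data; in practice use the observed MS and SP500 series.
    rng = np.random.default_rng(0)
    T = 1000
    sp = pd.Series(100 * np.exp(0.01 * rng.standard_normal(T).cumsum()))
    ms = pd.Series(50 * np.exp(0.01 * rng.standard_normal(T).cumsum()))

    # Engle-Granger test: H0 is "no cointegration".
    stat, pval, _ = coint(ms, sp)
    print(f"Engle-Granger stat = {stat:.2f}, p-value = {pval:.3f}")
    # Rejecting H0 is evidence of cointegration, in which case an ECM for
    # (MSt, SPt)' is appropriate; otherwise model the two series in differences.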
CONTINUED
Question 2
Consider the model
yt = −c1 ∗ 1(t ≤ T/2) + c2 ∗ 1(t > T/2) + εt + θ1εt−1 + θ2εt−2, t = 1, ..., T, (2)
where εt ∼ i.i.d.N(0, 1) and 1(·) equals 1 when the statement within parentheses is
true and equals zero otherwise.
1. (6 points) Under which conditions on the parameters is the model stationary?
Find the (unconditional) mean and variance of yt under these conditions.
A: The mean must be constant over the sample, which requires −c1 = c2 (call the
common value c); the MA(2) part is stationary for any θ1, θ2. Under this condition
yt is an MA(2) with mean c = c2 and variance 1 + θ1² + θ2².
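As a quick numerical illustration (parameter values made up), a simulation under the condition −c1 = c2 = c reproduces the stated mean and variance:

    import numpy as np

    rng = np.random.default_rng(0)
    c, theta1, theta2, T = 0.5, 0.4, 0.3, 200_000      # illustrative values

    eps = rng.standard_normal(T + 2)
    y = c + eps[2:] + theta1 * eps[1:-1] + theta2 * eps[:-2]

    print("sample mean    :", y.mean(), " theory:", c)
    print("sample variance:", y.var(), " theory:", 1 + theta1**2 + theta2**2)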
2. (6 points) Under which conditions is the model invertible?
A: The roots of the equation 1 + θ1x + θ2x² = 0 must be greater than 1 in absolute value.
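This is easy to check numerically for given θ's; with the illustrative values below (my own choice, not from the exam), the roots lie outside the unit circle:

    import numpy as np

    theta1, theta2 = 0.4, 0.3                      # illustrative values
    roots = np.roots([theta2, theta1, 1])          # solves theta2*x^2 + theta1*x + 1 = 0
    print("roots:", roots)
    print("invertible:", all(abs(r) > 1 for r in roots))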
3. (6 points) Given a quadratic loss function, what is the optimal 1-step ahead
forecast at time T based on model (2)?
A: 1-step ahead forecast is c2 + θ1εT + θ2εT−1
4. (7 points) What are the parameters of model (2) and how would you estimate
them?
A: The parameters are c1, c2, θ1, and θ2. You can estimate them by MLE.
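One possible implementation (an assumption, not prescribed by the exam) is to fit the model with statsmodels' SARIMAX, passing the two indicator functions as exogenous regressors; the sketch below simulates the model with made-up parameter values, estimates them by MLE, and also produces the 1-step-ahead forecast from question 2.3.

    import numpy as np
    from statsmodels.tsa.statespace.sarimax import SARIMAX

    rng = np.random.default_rng(0)
    T, c1, c2, theta1, theta2 = 600, 0.3, 0.8, 0.4, 0.3   # illustrative values
    t = np.arange(1, T + 1)
    eps = rng.standard_normal(T + 2)
    y = (-c1 * (t <= T / 2) + c2 * (t > T / 2)
         + eps[2:] + theta1 * eps[1:-1] + theta2 * eps[:-2])

    # Exogenous design: columns -1(t<=T/2) and 1(t>T/2), so the fitted
    # coefficients correspond directly to c1 and c2.
    exog = np.column_stack([-(t <= T / 2).astype(float),
                            (t > T / 2).astype(float)])
    fit = SARIMAX(y, exog=exog, order=(0, 0, 2), trend="n").fit(disp=False)
    print(fit.params)   # c1, c2, ma.L1 (theta1), ma.L2 (theta2), sigma2

    # Question 2.3: forecast of y_{T+1}, i.e. c2 + theta1*eps_T + theta2*eps_{T-1};
    # the exogenous values at T+1 are [0, 1].
    print(fit.forecast(steps=1, exog=np.array([[0.0, 1.0]])))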
CONTINUED
Question 3
Consider the model
yt = σtzt ≡ εt,  t = 1, ..., T,  (3)
σt² = w + α1εt−1² + α2εt−2²,  (4)
where zt ∼ i.i.d.N(0, 1).
1. (5 points) Derive the unconditional mean and unconditional variance of yt.
A: E[yt] = 0 and Var[yt] = w / (1 − α1 − α2).
2. (6 points) Explain how you would estimate the parameters by OLS (i.e., by
running a regression).
A: This is an ARCH(2), which is an AR(2) for εt². εt² can be approximated here
(due to the lack of a conditional mean) with yt², so estimate by regressing yt² on
yt−1² and yt−2².
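A sketch of this regression; the data are simulated from an ARCH(2) with made-up parameters purely so the snippet runs end to end, and with real data y would be the observed series.

    import numpy as np
    import statsmodels.api as sm

    # Simulate an ARCH(2) so the example is self-contained (illustrative values).
    rng = np.random.default_rng(0)
    w, a1, a2, T = 0.2, 0.3, 0.2, 5000
    y = np.zeros(T)
    for t in range(2, T):
        sig2 = w + a1 * y[t - 1] ** 2 + a2 * y[t - 2] ** 2
        y[t] = np.sqrt(sig2) * rng.standard_normal()

    # OLS of y_t^2 on a constant, y_{t-1}^2 and y_{t-2}^2.
    y2 = y ** 2
    X = sm.add_constant(np.column_stack([y2[1:-1], y2[:-2]]))
    ols = sm.OLS(y2[2:], X).fit()
    print(ols.params)   # estimates of (w, alpha1, alpha2)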
3. (7 points) Suppose that, after running the regression in question 3.2, you find
that the residuals of the regression are autocorrelated. Explain how you would
modify the model for yt and how you would estimate the new model.
A: You need to increase the order of the ARCH (e.g., start with ARCH(3) and
increase progressively until you don’t find any residual autocorrelation). You
can still estimate these models by OLS.
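Continuing the sketch from question 3.2 (so ols is the fitted regression of yt² on its lags from there), the residual autocorrelation could be checked with a Ljung-Box test; the lag choices are arbitrary.

    from statsmodels.stats.diagnostic import acorr_ljungbox

    # Small p-values indicate remaining autocorrelation in the residuals of the
    # y_t^2 regression, suggesting a higher ARCH order is needed.
    print(acorr_ljungbox(ols.resid, lags=[5, 10]))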
4. (7 points) Explain how you would construct an estimate of the standardized
residual, zˆt, using results from the regression in question 3.2.
A: zˆt = yt / σˆt, where σˆt = √(wˆ + αˆ1yt−1² + αˆ2yt−2²).
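Again continuing from the regression sketch in question 3.2 (reusing y, y2 and ols from there), the standardized residuals could be computed as:

    import numpy as np

    w_hat, a1_hat, a2_hat = ols.params
    # Fitted conditional variance for t = 2, ..., T-1 (OLS does not enforce
    # positivity, so in practice check that sigma2_hat > 0 before taking roots).
    sigma2_hat = w_hat + a1_hat * y2[1:-1] + a2_hat * y2[:-2]
    z_hat = y[2:] / np.sqrt(sigma2_hat)
    print(z_hat.mean(), z_hat.std())   # roughly 0 and 1 if the ARCH(2) fits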
CONTINUED
Question 4
Consider the model
yt = a + bt + φyt−1 + εt,  (5)
εt ∼ i.i.d. N(0, σ²); t = 1, ..., T.
1. (5 points) Under which conditions on the parameters is the model covariance-
stationary (in words, no need to prove it formally)?
A: b = 0, so the model does not have a time trend, and |φ| < 1, so that the root
of the lag polynomial equation is outside the unit circle.
2. (5 points) Derive the autocorrelogram of εt.
A: Since εt is white noise, the autocorrelogram equals 1 at lag j = 0 and zero at all lags j > 0.
3. (5 points) Derive the conditional mean E[yt|Ωt−1].
A: E[yt|Ωt−1] = a + bt + φyt−1.
4. (5 points) Derive the conditional density f(yt | Ωt−1).
A: N(a + bt + φyt−1, σ²).
5. (5 points) Derive the Mean Squared Forecast Error for the one-step-ahead opti-
mal forecast at time t implied by the model (assuming a quadratic loss function).
A: E[(yt+1 − a − b(t+1) − φyt)²] = E[εt+1²] = σ².
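A quick simulation check of this result, with made-up parameter values:

    import numpy as np

    rng = np.random.default_rng(0)
    a, b, phi, sigma, T = 0.5, 0.01, 0.8, 2.0, 50_000   # illustrative values
    eps = sigma * rng.standard_normal(T)
    y = np.zeros(T)
    for t in range(1, T):
        y[t] = a + b * t + phi * y[t - 1] + eps[t]

    # Optimal 1-step-ahead forecast of y_t made at t-1, using the true parameters;
    # the forecast error is eps_t, so the empirical MSFE should be close to sigma^2.
    tt = np.arange(1, T)
    forecasts = a + b * tt + phi * y[:-1]
    print("empirical MSFE:", np.mean((y[1:] - forecasts) ** 2), " sigma^2:", sigma ** 2)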
END OF PAPER