STAT 610 Final Exam
7:00pm-8:30pm, May 6, 2021
Please show all your work and reasoning for full credit.
1. Let X1, ..., Xn be independent and identically distributed random variables with a
common pdf fθ(x) = (1/(2θ)) e^{−|x|/θ}, where θ is an unknown parameter and θ > 0.
(a) (2 points) Derive the UMVUE of θ.
(b) (2 points) Derive the MLE θˆ of θ and prove that your solution θˆ is the unique
MLE.
(c) (2 points) Obtain the asymptotic distribution of √n(θˆ − θ) as n → ∞.
(d) (2 points) Let ϕ = 1/θ and take the prior for ϕ to be the gamma distribution with pdf
[γ^α Γ(α)]^{−1} ϕ^{α−1} e^{−ϕ/γ} I(ϕ > 0), where α and γ are known positive constants. Derive
the Bayes estimator of θ = 1/ϕ under the squared error loss (i.e., the posterior
mean of θ = 1/ϕ).
(e) (2 points) Obtain the asymptotic relative efficiency of the MLE θˆ in (b) w.r.t. the
Bayes estimator of θ in (d).
2. Let X1, ..., Xn be independent and identically distributed positive random variables
with a common pdf
fθ(x) = (θ^θ/Γ(θ)) x^{θ−1} e^{−θx} I(x > 0),
where θ > 0 is unknown.
(a) (2 points) Let ψ(θ) = [dΓ(θ)/dθ]/Γ(θ). Show that the Fisher information contained in
X1, ..., Xn is
In(θ) = n(dψ(θ)/dθ − 1/θ).
(b) Consider the hypotheses H0 : θ = θ0 versus H1 : θ ≠ θ0, where θ0 is a fixed
positive constant.
i. (2 points) Derive the Wald test with a given significance level α (do not try
to find an explicit formula for the MLE of θ).
ii. (2 points) Derive the score test with a given significance level α.
(c) Consider the hypotheses H0 : θ ≤ θ0 versus H1 : θ > θ0, where θ0 is a fixed
positive constant.
i. (2 points) Show that the family of joint pdf’s of X1, ..., Xn has a monotone
likelihood ratio in T = ∑_{i=1}^n (log Xi − Xi).
ii. (1 point) Show that the UMP test of size α ∈ (0, 1) rejects H0 if and only
if T > c with a constant c satisfying α = Pθ0(T > c), where Pθ0 is the
probability under θ = θ0 (you do not need to find an explicit form of c).
(d) (2 points) Let Fθ(t) be the cdf of T . Show that Fθ(t) is non-increasing in θ for each
fixed t and, using this result, obtain a confidence interval for θ with confidence
coefficient 1− α.
3. Let X1, ..., Xn be independent and identically distributed random variables with a
common pdf θ−1f(x/θ), where θ > 0 is unknown and f is a known pdf satisfying
f(x) = f(−x).
(a) (2 points) Show that (∑_{i=1}^n Xi^2/cα)^{1/2} is a lower confidence bound for θ with
confidence coefficient 1 − α, where cα is the 1 − α quantile of the cdf of θ^{−2} ∑_{i=1}^n Xi^2.
(b) (3 points) Suppose that f(x) is the pdf of the standard normal. Show that the
lower confidence bound in (a) is (0, θ)-UMA, where θ is the true parameter value.
Compared with other lower confidence bounds for θ, in what sense is the bound
in (a) better?
(c) (4 points) Suppose that f(x) = x when 0 < x < 1 and f(x) = 0 when x ≥ 1. Let
T = max_{i=1,...,n} |Xi|. Show that among all confidence intervals of θ in the class
{(T/b, T/a) : 0 < a < b ≤ 1 are constants satisfying P(T/b ≤ θ ≤ T/a) = 1 − α},
the one with b = 1 and a = α^{1/(2n)} has the shortest length.
4. Let X1, ..., Xn be independent and identically distributed random variables with a
common pdf φ−1e−x/φI(x > 0) and Y1, ..., Yn be independent and identically distributed
random variables with a common pdf ϕ−1e−y/ϕI(y > 0), where both φ > 0 and ϕ > 0
are unknown parameters and Xi’s and Yi’s are independent. Let θ = (φ, ϕ)′. Consider
testing
H0 : R(θ) = 0 versus H1 : R(θ) ≠ 0,
where R(θ) = φ − cϕ and c is a known positive constant.
(a) (2 points) Under the hypothesis H0 : R(θ) = 0, obtain the MLE’s of φ and ϕ.
(b) (3 points) Derive the Wald test statistic
Wn = [R(θˆ)]′ {[C(θˆ)]′ [In(θˆ)]^{−1} C(θˆ)}^{−1} R(θˆ),
where θˆ is the MLE of θ, C(θ) = ∂R(θ)/∂θ, and In(θ) is the Fisher information
contained in X1, ..., Xn, Y1, ..., Yn.
(c) (2 points) Derive the score test statistic Rn.
(d) (3 points) Obtain a 1−α asymptotic confidence interval for φ/ϕ by inverting the
acceptance regions of Wald’s tests in (b) with different c’s.
Solution
1. (a) The family of pdf’s is an exponential family with complete and sufficient statistic
T = ∑_{i=1}^n |Xi|. Since E|Xi| = θ, T/n is unbiased for θ and hence it is the UMVUE of θ.
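As an illustrative numerical check (not part of the required derivation), the unbiasedness of T/n can be verified by simulation; the following minimal Python sketch uses arbitrary values of θ, n, the replication count, and the seed:

```python
# Check empirically that T/n = mean(|X_i|) is unbiased for theta when the X_i are
# i.i.d. with pdf (1/(2*theta))*exp(-|x|/theta), i.e., Laplace(0, theta).
import numpy as np

rng = np.random.default_rng(0)
theta, n, reps = 2.0, 50, 20000
x = rng.laplace(loc=0.0, scale=theta, size=(reps, n))
umvue = np.abs(x).mean(axis=1)   # T/n for each simulated sample
print(umvue.mean())              # should be close to theta = 2.0
```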
(b) The log-likelihood is −n log θ − T/θ (up to an additive constant), with T = ∑_{i=1}^n |Xi|.
The score function is S(θ) = T/θ^2 − n/θ, and S(θ) = 0 has the unique solution θˆ = T/n.
Since S(θ) = (n/θ)(θˆ/θ − 1), which is positive if θ < θˆ and negative if θ > θˆ, the
log-likelihood increases on (0, θˆ] and decreases on [θˆ, ∞), so θˆ is the unique MLE.
(c) We apply Theorem 10.1.12. Since S′(θ) = n/θ^2 − 2T/θ^3, the Fisher information is
−E{S′(θ)} = 2E(T)/θ^3 − n/θ^2 = 2nθ/θ^3 − n/θ^2 = n/θ^2.
Hence the asymptotic distribution of √n(θˆ − θ) is N(0, θ^2).
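A small simulation sketch of the N(0, θ^2) limit (illustrative only; the sample size, replication count, and seed are arbitrary choices):

```python
# The standardized MLE sqrt(n)*(theta_hat - theta) should have variance close to theta^2.
import numpy as np

rng = np.random.default_rng(1)
theta, n, reps = 2.0, 400, 20000
x = rng.laplace(loc=0.0, scale=theta, size=(reps, n))
theta_hat = np.abs(x).mean(axis=1)
z = np.sqrt(n) * (theta_hat - theta)
print(z.var(), theta**2)         # empirical variance vs. the asymptotic variance theta^2
```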
(d) The posterior of ϕ is proportional to
ϕ^{α−1} e^{−ϕ/γ} · ϕ^n e^{−Tϕ} = ϕ^{n+α−1} e^{−ϕ(T+γ^{−1})},
which is the gamma distribution with shape parameter n + α and scale parameter
(T + γ^{−1})^{−1}. The Bayes estimator of θ = 1/ϕ is
θˆB = [(T + γ^{−1})^{n+α}/Γ(n + α)] ∫_0^∞ ϕ^{n+α−2} e^{−ϕ(T+γ^{−1})} dϕ
= [(T + γ^{−1})^{n+α}/Γ(n + α)] · Γ(n + α − 1)/(T + γ^{−1})^{n+α−1}
= (T + γ^{−1})/(n + α − 1).
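The closed form can be cross-checked against a direct numerical computation of the posterior mean of 1/ϕ; the following sketch uses arbitrary illustrative values of n, α, γ, and T:

```python
# Posterior of phi is Gamma(shape = n + alpha, rate = T + 1/gamma); compare the
# closed-form posterior mean of theta = 1/phi with numerical integration.
import numpy as np
from scipy import stats, integrate

n, alpha, gamma_, T = 30, 2.0, 1.5, 55.0        # arbitrary illustrative values
rate = T + 1.0 / gamma_
closed_form = rate / (n + alpha - 1.0)

post = stats.gamma(a=n + alpha, scale=1.0 / rate)
numeric, _ = integrate.quad(lambda p: post.pdf(p) / p, 0.0, np.inf)
print(closed_form, numeric)                     # the two values should agree
```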
(e) Note that
√n(θˆ − θˆB) = √n(θˆ − (T + γ^{−1})/(n + α − 1))
= √n [θˆ − (θˆ + 1/(γn))/(1 + (α − 1)/n)]
= [1/(1 + (α − 1)/n)] [((α − 1)/√n) θˆ − 1/(γ√n)],
which tends to 0 in probability as n → ∞. Hence, the asymptotic relative efficiency is 1.
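A quick numerical illustration of this limit (arbitrary θ, hyperparameters, and seed): √n(θˆ − θˆB) should shrink toward 0 as n grows.

```python
# sqrt(n)*(theta_hat - theta_B) -> 0, so the MLE and the Bayes estimator have the
# same asymptotic distribution and the asymptotic relative efficiency is 1.
import numpy as np

rng = np.random.default_rng(2)
theta, alpha, gamma_ = 2.0, 2.0, 1.5
for n in (50, 500, 5000):
    x = rng.laplace(loc=0.0, scale=theta, size=n)
    T = np.abs(x).sum()
    theta_hat = T / n
    theta_B = (T + 1.0 / gamma_) / (n + alpha - 1.0)
    print(n, np.sqrt(n) * (theta_hat - theta_B))
```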
2. (a) The likelihood is
L(θ) = (θ^{nθ}/Γ(θ)^n) exp{−θ ∑_{i=1}^n Xi + (θ − 1) ∑_{i=1}^n log Xi}.
The log-likelihood is
ℓ(θ) = nθ log θ − n log Γ(θ) − θ ∑_{i=1}^n Xi + (θ − 1) ∑_{i=1}^n log Xi
= nθ log θ − n log Γ(θ) + θT − ∑_{i=1}^n log Xi,
where T = ∑_{i=1}^n (log Xi − Xi).
S(θ) = ℓ′(θ) = n log θ + n − nΓ′(θ)/Γ(θ) + T = n log θ + n − nψ(θ) + T
by the definition of ψ(θ), and
S′(θ) = n/θ − nψ′(θ).
Since S′(θ) is non-random, the Fisher information is In(θ) = −S′(θ) = n(ψ′(θ) − 1/θ).
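A numerical sanity check (illustrative only): ψ′ is the trigamma function, and In(θ) should match the variance of the score S(θ) under θ. The parameter value, sample size, and seed below are arbitrary:

```python
# Compare I_n(theta) = n*(psi'(theta) - 1/theta) with the empirical variance of the
# score S(theta) = n*log(theta) + n - n*psi(theta) + T, where T = sum(log X_i - X_i).
import numpy as np
from scipy.special import digamma, polygamma

rng = np.random.default_rng(3)
theta, n, reps = 3.0, 40, 20000
# f_theta is the Gamma density with shape theta and rate theta, i.e. scale = 1/theta
x = rng.gamma(shape=theta, scale=1.0 / theta, size=(reps, n))
T = (np.log(x) - x).sum(axis=1)
score = n * np.log(theta) + n - n * digamma(theta) + T
print(score.var(), n * (polygamma(1, theta) - 1.0 / theta))   # should be close
```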
(b) The Wald test statistic for H0 : θ = θ0 is
Wn = (θˆ − θ0)^2 In(θˆ) = n(θˆ − θ0)^2 (ψ′(θˆ) − 1/θˆ),
where θˆ is the MLE of θ. The Wald test with significance level α rejects H0 if Wn > χ^2_α,
where χ^2_α is the 1 − α quantile of the chi-square distribution with one degree of freedom.
The score test statistic for H0 : θ = θ0 is
Rn = S(θ0)^2 {In(θ0)}^{−1} = {n log θ0 + n − nψ(θ0) + T}^2 [n(ψ′(θ0) − 1/θ0)]^{−1}.
The score test with significance level α rejects H0 if Rn > χ^2_α.
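As a sketch of how the score test behaves under H0 (it needs no MLE), one can check by simulation that its rejection rate is close to the nominal α; the settings below are arbitrary:

```python
# Under H0 (theta = theta0), R_n should be approximately chi-square(1), so rejecting
# when R_n exceeds the 1 - alpha chi-square quantile gives type I error near alpha.
import numpy as np
from scipy.special import digamma, polygamma
from scipy.stats import chi2

rng = np.random.default_rng(4)
theta0, n, reps, alpha = 3.0, 200, 5000, 0.05
x = rng.gamma(shape=theta0, scale=1.0 / theta0, size=(reps, n))
T = (np.log(x) - x).sum(axis=1)
score = n * np.log(theta0) + n - n * digamma(theta0) + T
info = n * (polygamma(1, theta0) - 1.0 / theta0)
Rn = score**2 / info
print((Rn > chi2.ppf(1 - alpha, df=1)).mean())   # should be near alpha = 0.05
```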
(c) For θ1 < θ2,
L(θ2)/L(θ1) = [θ2^{nθ2} Γ(θ1)^n / (θ1^{nθ1} Γ(θ2)^n)] exp{(θ2 − θ1)T},
which is an increasing function of T, so the family has a monotone likelihood ratio in T.
It follows from Theorem 8.3.17 that the UMP test of size α rejects H0 when T > c, with c
satisfying α = Pθ0(T > c).
(d) The fact that Fθ(t) is non-increasing in θ for every fixed t follows from Lemma B
in Lecture 15 (see the last two lines on page 11 of Lecture 15). Then, by Theorem
9.2.12, a confidence interval with confidence coefficient 1−α is [L(T ), U(T )] with
U(t) = sup{θ : Fθ(t) ≥ α1}, L(t) = inf{θ : Fθ(t) ≤ 1− α2}
where α1 + α2 = α.
3. (a) Since the pdf of Xi/θ is f, (∑_{i=1}^n Xi^2)^{1/2}/θ is a pivot, and so is ∑_{i=1}^n Xi^2/θ^2. Then
Pθ(θ ≥ (∑_{i=1}^n Xi^2/cα)^{1/2}) = Pθ(∑_{i=1}^n Xi^2/θ^2 ≤ cα) = 1 − α
by the definition of cα.
(b) When f is the standard normal pdf, the likelihood is in an exponential family having MLR in
Y = ∑_{i=1}^n Xi^2. It follows from Theorem 8.3.17 that a UMP test of size α rejects
H0 : θ ≤ θ0 when Y/θ0^2 > cα. The acceptance region is {Y/θ0^2 ≤ cα}. Inverting the
acceptance regions gives the lower confidence bound (Y/cα)^{1/2}, which is the bound in (a).
By Theorem 9.3.5, this lower confidence bound is (0, θ)-UMA with confidence coefficient 1 − α.
Let C(X) be the confidence set corresponding to this bound. If C1(X) is another
confidence set with level 1 − α, then
Pθ(θ′ ∈ C(X)) ≤ Pθ(θ′ ∈ C1(X))
for every false value θ′ ∈ (0, θ). In other words, among all 1 − α lower confidence bounds,
the bound in (a) has the smallest probability of covering false values below the true θ,
which is the sense in which it is better.
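A coverage sketch for this normal case: since ∑ Xi^2/θ^2 is chi-square with n degrees of freedom, cα is its 1 − α quantile; the values of θ, n, α, and the seed are arbitrary illustrative choices:

```python
# Coverage of the lower confidence bound (sum X_i^2 / c_alpha)^{1/2} when f is standard normal.
import numpy as np
from scipy.stats import chi2

rng = np.random.default_rng(5)
theta, n, reps, alpha = 1.7, 20, 20000, 0.10
c_alpha = chi2.ppf(1 - alpha, df=n)              # 1 - alpha quantile of chi-square(n)
x = rng.normal(loc=0.0, scale=theta, size=(reps, n))
lower = np.sqrt((x**2).sum(axis=1) / c_alpha)
print((theta >= lower).mean())                   # should be close to 1 - alpha = 0.90
```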
(c) By symmetry, the pdf of |Xi|/θ is 2t I(0 < t < 1). Then the pdf of T/θ is
2n t^{2n−1} I(0 < t < 1). Hence,
P(T/b ≤ θ ≤ T/a) = P(a ≤ T/θ ≤ b) = 2n ∫_a^b t^{2n−1} dt = b^{2n} − a^{2n},
i.e.,
1 − α = b^{2n} − a^{2n}.
Differentiating this constraint with respect to b gives
b^{2n−1} − a^{2n−1} (da/db) = 0, or da/db = b^{2n−1}/a^{2n−1}.
The length of (T/b, T/a) is
T/a − T/b = T(1/a − 1/b).
Note that
d/db (1/a − 1/b) = 1/b^2 − (1/a^2)(da/db) = 1/b^2 − b^{2n−1}/a^{2n+1} = (a^{2n+1} − b^{2n+1})/(b^2 a^{2n+1}) < 0
since b > a > 0. This means the length is a decreasing function of b, so the interval with the
shortest length is the one with b = 1, which implies that a = α^{1/(2n)}.
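A numerical check of the shortest-length claim (illustrative n and α): scan over b subject to b^{2n} − a^{2n} = 1 − α and confirm that the scaled length 1/a − 1/b is minimized at b = 1.

```python
# Over the constraint b^(2n) - a^(2n) = 1 - alpha with 0 < a < b <= 1, the scaled
# interval length 1/a - 1/b should be minimized at b = 1, where a = alpha^(1/(2n)).
import numpy as np

n, alpha = 5, 0.10
b = np.linspace((1 - alpha) ** (1 / (2 * n)) + 1e-9, 1.0, 500)   # need b^(2n) > 1 - alpha
a = (b ** (2 * n) - (1 - alpha)) ** (1 / (2 * n))
length = 1.0 / a - 1.0 / b
i = np.argmin(length)
print(b[i], a[i], alpha ** (1 / (2 * n)))        # expect b close to 1, a close to alpha^(1/(2n))
```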
4. (a) Let T = ∑_{i=1}^n Xi and S = ∑_{i=1}^n Yi. The joint likelihood is
L(θ) = (1/(φ^n ϕ^n)) exp(−T/φ − S/ϕ).
Under R(θ) = 0, i.e., φ = cϕ, the likelihood becomes
(1/(c^n ϕ^{2n})) exp{−(c^{−1}T + S)/ϕ}.
Hence, the MLE of ϕ is ϕ˜ = (c^{−1}T + S)/(2n) and the MLE of φ is φ˜ = cϕ˜ = (T + cS)/(2n).
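A sketch checking the restricted MLE against direct numerical maximization of the restricted log-likelihood (simulated data; n, c, and the true parameter values are arbitrary):

```python
# Maximize the restricted likelihood (phi = c*varphi) numerically over varphi and
# compare with the closed form varphi_tilde = (T/c + S)/(2n).
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(6)
n, c, phi, varphi = 50, 2.0, 3.0, 1.2
x = rng.exponential(scale=phi, size=n)
y = rng.exponential(scale=varphi, size=n)
T, S = x.sum(), y.sum()

def neg_restricted_loglik(v):
    # negative log of (1/(c^n * v^(2n))) * exp(-(T/c + S)/v)
    return n * np.log(c) + 2 * n * np.log(v) + (T / c + S) / v

res = minimize_scalar(neg_restricted_loglik, bounds=(1e-6, 100.0), method="bounded")
print(res.x, (T / c + S) / (2 * n))              # the two values should agree
```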
(b) The MLE of θ is θˆ = (φˆ, ϕˆ)′ with φˆ = T/n and ϕˆ = S/n, and C(θ) = (1, −c)′. The score is
sn(θ) = ∂ log L(θ)/∂θ = (T/φ^2 − n/φ, S/ϕ^2 − n/ϕ)′
and
∂^2 log L(θ)/∂θ∂θ′ = diag(−2T/φ^3 + n/φ^2, −2S/ϕ^3 + n/ϕ^2).
Taking −E of this matrix, with E(T) = nφ and E(S) = nϕ, gives
In(θ) = n diag(1/φ^2, 1/ϕ^2)
and hence
Wn = n(φˆ − cϕˆ)^2/(φˆ^2 + c^2ϕˆ^2).
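A simulation sketch of the null behaviour of Wn (arbitrary settings): under H0 it should be approximately chi-square with one degree of freedom.

```python
# Under H0 (phi = c*varphi), W_n should be approximately chi-square(1), so its
# rejection rate at the 0.95 chi-square quantile should be near 0.05.
import numpy as np
from scipy.stats import chi2

rng = np.random.default_rng(7)
n, c, varphi, reps = 200, 2.0, 1.5, 10000
phi = c * varphi                                  # H0 holds
x = rng.exponential(scale=phi, size=(reps, n))
y = rng.exponential(scale=varphi, size=(reps, n))
phi_hat, varphi_hat = x.mean(axis=1), y.mean(axis=1)
Wn = n * (phi_hat - c * varphi_hat) ** 2 / (phi_hat**2 + c**2 * varphi_hat**2)
print((Wn > chi2.ppf(0.95, df=1)).mean())         # should be close to 0.05
```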
(c) Let θ˜ = (φ˜, ϕ˜)′ be the MLE under H0. Then
Rn = [sn(θ˜)]′ [In(θ˜)]^{−1} sn(θ˜)
= n^{−1} (T/φ˜^2 − n/φ˜, S/ϕ˜^2 − n/ϕ˜) diag(φ˜^2, ϕ˜^2) (T/φ˜^2 − n/φ˜, S/ϕ˜^2 − n/ϕ˜)′
= (1/n)(T/φ˜ − n)^2 + (1/n)(S/ϕ˜ − n)^2.
(d) For every c > 0, let Wn(c) = n(φˆ − cϕˆ)^2/(φˆ^2 + c^2ϕˆ^2), and let d be the 1 − α quantile of
the chi-square distribution with one degree of freedom. For every c, the acceptance region of
the Wald test is
{X1, ..., Xn, Y1, ..., Yn : Wn(c) ≤ d}.
A 1 − α asymptotic confidence interval for φ/ϕ is
{c : Wn(c) ≤ d} = {c : (φˆ − cϕˆ)^2/(φˆ^2 + c^2ϕˆ^2) ≤ d/n},
which is an interval with limits equal to the two solutions of
(φˆ − cϕˆ)^2 = (d/n)(φˆ^2 + c^2ϕˆ^2),
i.e., of
(1 − d/n)ϕˆ^2 c^2 − 2φˆϕˆ c + (1 − d/n)φˆ^2 = 0,
namely
c = [2φˆϕˆ ± √(4φˆ^2ϕˆ^2 − 4(1 − d/n)^2 φˆ^2ϕˆ^2)] / [2(1 − d/n)ϕˆ^2]
= φˆ[1 ± √(1 − (1 − d/n)^2)] / [(1 − d/n)ϕˆ].
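A sketch that computes the interval from the two roots above and checks its coverage for φ/ϕ by simulation (arbitrary settings; it assumes d/n < 1 so the quadratic has a positive leading coefficient):

```python
# Interval for phi/varphi from the roots c = phi_hat*(1 -/+ sqrt(1 - k^2))/(k*varphi_hat), k = 1 - d/n.
import numpy as np
from scipy.stats import chi2

rng = np.random.default_rng(8)
n, phi, varphi, reps, alpha = 300, 3.0, 1.2, 5000, 0.05
d = chi2.ppf(1 - alpha, df=1)
x = rng.exponential(scale=phi, size=(reps, n))
y = rng.exponential(scale=varphi, size=(reps, n))
phi_hat, varphi_hat = x.mean(axis=1), y.mean(axis=1)

k = 1 - d / n                                     # assumed positive (d/n < 1)
ratio = phi_hat / varphi_hat
half = ratio * np.sqrt(1 - k**2) / k
lower, upper = ratio / k - half, ratio / k + half
true_ratio = phi / varphi
print(((lower <= true_ratio) & (true_ratio <= upper)).mean())   # should be near 1 - alpha = 0.95
```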