THE UNIVERSITY OF NEW SOUTH WALES
DEPARTMENT OF STATISTICS
MATH3811/MATH3911 - Statistical Inference / Higher Statistical Inference
ASSIGNMENT 1
Please add a cover page containing a copy of your ID card, write in your own
handwriting:
“I declare that this assignment is my own work, except where acknowledged. I have
read and understood the University Rules regarding Academic Misconduct”, and sign
it.
Assignment due: Thursday, 10th March 2022, 1 pm at the latest. To be submitted
via the assignment link on Moodle.
Math3811: Attempt all questions except for the star question; Math3911:
Attempt all questions.
1. In a manufacturing process in which glass items are being produced, defects in the
form of bubbles occur; if there is more than one bubble, the item is considered
defective. The number of bubbles $X$ is thought to follow a Poisson($\mu$) distribution
$$P_\mu(X = x) = f(x, \mu) = \frac{e^{-\mu}\mu^x}{x!}, \qquad x = 0, 1, 2, \ldots$$
Let a sample $X = (X_1, X_2, \ldots, X_n)$ of such Poisson random variables be given.
a) Suggest a simple unbiased estimator $W$ of the parameter $\tau(\mu) = P(X \le 1) =
(1 + \mu)e^{-\mu}$ (i.e., of the probability of at most one defect).
b) Argue that $T = \sum_{i=1}^n X_i$ is complete and minimal sufficient for $\mu$. Hence (or
otherwise) derive the UMVUE of $\tau(\mu)$.
c) What is the MLE $\hat{\tau}$ of $\tau(\mu)$? Find the asymptotic distribution of this MLE (i.e.,
state the asymptotic distribution of $\sqrt{n}(\hat{\tau} - \tau(\mu))$).
d) Defects have been counted on 10 glass items, giving the following result:
0, 3, 2, 1, 3, 4, 0, 1, 2, 2.
Find a point estimate of the probability $\tau(\mu)$ from the given data by applying the
methods in (b) and (c). Compare the two numerical values and provide a comment.
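For a quick numerical cross-check of the plug-in estimate in (c), the following sketch can be used. It assumes the standard facts that the Poisson MLE of $\mu$ is the sample mean and that MLEs are invariant under transformations; the UMVUE in (b) still has to be derived by hand.

```python
import math

# Observed bubble counts on the 10 glass items
data = [0, 3, 2, 1, 3, 4, 0, 1, 2, 2]

n = len(data)
mu_hat = sum(data) / n                       # Poisson MLE of mu: the sample mean
tau_hat = (1 + mu_hat) * math.exp(-mu_hat)   # plug-in (MLE) estimate of tau(mu)

print(f"mu_hat = {mu_hat:.4f}, tau_hat = {tau_hat:.4f}")
```

Comparing this value with the UMVUE from (b) is what the question asks you to comment on.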
e) Is the variance of the estimator proposed in (b) equal to the Cramér-Rao lower
bound for the variance of an unbiased estimator of $\tau(\mu)$? Explain. Calculate the
Cramér-Rao lower bound explicitly.
f) Find an asymptotic 95% confidence interval for $\tau(\mu)$ using the available data and
the asymptotic distribution in (c).
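One way to compute such an interval numerically is sketched below. It assumes the delta-method form of the asymptotic variance, $\tau'(\mu)^2 \mu / n$ with $\tau'(\mu) = -\mu e^{-\mu}$, which is exactly what part (c) asks you to derive; treat the sketch as a check on your own calculation, not the derivation itself.

```python
import math

data = [0, 3, 2, 1, 3, 4, 0, 1, 2, 2]
n = len(data)
mu_hat = sum(data) / n
tau_hat = (1 + mu_hat) * math.exp(-mu_hat)

# Delta method (assumed here; part (c) asks for the derivation):
# tau'(mu) = -mu * exp(-mu), so the asymptotic variance of tau_hat is
# tau'(mu)^2 * mu / n, estimated by plugging in mu_hat.
se = math.sqrt(mu_hat ** 3 * math.exp(-2 * mu_hat) / n)

z = 1.959963984540054  # 97.5% standard-normal quantile
lo, hi = tau_hat - z * se, tau_hat + z * se
print(f"95% CI for tau(mu): ({lo:.4f}, {hi:.4f})")
```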
2. Suppose the discrete random variable $X$ has the following probability mass function
$$f_X(x; \theta) = \left(\frac{3}{4}\right)^{\frac{x(x+1)}{2}} \left(\frac{1}{4}\right)^{\frac{x(x-1)}{2}} \theta^{x^2} (1 - \theta)^{1 - x^2}, \qquad x \in \{-1, 0, 1\}.$$
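As a sanity check on this mass function, a short script confirms that the three probabilities sum to one for any $\theta$; exact rational arithmetic is used so there is no rounding.

```python
from fractions import Fraction

def pmf(x, theta):
    # f_X(x; theta) as given, for x in {-1, 0, 1}
    return (Fraction(3, 4) ** (x * (x + 1) // 2)
            * Fraction(1, 4) ** (x * (x - 1) // 2)
            * theta ** (x * x)
            * (1 - theta) ** (1 - x * x))

# The three probabilities sum to 1 for any theta in [0, 1]
for theta in [Fraction(1, 10), Fraction(1, 2), Fraction(9, 10)]:
    print(theta, sum(pmf(x, theta) for x in (-1, 0, 1)))
```

Note that $f_X(-1;\theta) = \theta/4$, $f_X(0;\theta) = 1-\theta$, and $f_X(1;\theta) = 3\theta/4$, which is helpful for parts a)-d).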
a) Is $T_1 = X$ sufficient for $\theta$? Give reasons.
b) Is $T_2 = X^2$ sufficient for $\theta$? Give reasons.
c) Is $T_1 = X$ complete for $\theta$? Give reasons.
d) Is $T_2 = X^2$ complete for $\theta$? Give reasons.
3. Let $X_1, X_2, \ldots, X_n$ be i.i.d. random variables with density
$$f(x; \theta) = \begin{cases} \frac{1}{\lambda} e^{-(x-\theta)/\lambda}, & \text{if } x \ge \theta, \\ 0, & \text{else,} \end{cases}$$
$\theta$ being an unknown parameter and $\lambda > 0$ a fixed constant.
a) Sketch a graph of a density from this family for fixed $\theta = 1$ and $\lambda = 1$.
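A few tabulated values make the shape clear; this sketch assumes the shifted-exponential form of the density above with fixed scale $\lambda$: the density is zero to the left of $\theta$, jumps to $1/\lambda$ at $x = \theta$, and decays exponentially thereafter.

```python
import math

def f(x, theta=1.0, lam=1.0):
    # Shifted exponential density: (1/lam) * exp(-(x - theta)/lam) for x >= theta
    return math.exp(-(x - theta) / lam) / lam if x >= theta else 0.0

# Tabulate a few values for theta = 1, lam = 1
for x in [0.0, 0.5, 1.0, 2.0, 3.0]:
    print(f"f({x}) = {f(x):.4f}")
```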
b) Find the cumulative distribution function $F_{X_{(1)}}(y; \theta)$ and the density $f_{X_{(1)}}(y; \theta)$
of the smallest of the observations, $X_{(1)}$. (Don't treat $\theta$ and $\lambda$ as specific values any
more.)
Hint: you could use
$$P(X_{(1)} \le y) = 1 - P(X_1 > y, X_2 > y, \ldots, X_n > y) = 1 - [P(X_1 > y)]^n.$$
c) Justify that $X_{(1)}$ is a minimal sufficient statistic for $\theta$.
d) Find $E\,X_{(1)}$.
e) Show that $X_{(1)}$ is complete. Hence find the UMVUE of $\theta$.
4. (*) a) i) Suppose we are interested in the parameter $\theta$ and $\hat{\theta}(X_1, \ldots, X_n)$ is an
estimator of $\theta$ with $E(\hat{\theta}) - \theta = b(\theta)$. Show that
$$\mathrm{Var}(\hat{\theta}) \ge \left(1 + \frac{\partial b(\theta)}{\partial \theta}\right)^2 I_X(\theta)^{-1},$$
where $I_X(\theta)$ is the Fisher information. Assume that you can interchange the order
of integration and differentiation. You may need to use the well-known result
(Cauchy-Schwarz inequality) that for any two random variables $U$ and $V$ we have
$$\mathrm{Cov}(U, V)^2 \le \mathrm{Var}(U)\,\mathrm{Var}(V).$$
ii) Suppose $X = (X_1, X_2, \ldots, X_n)$ is an i.i.d. sample from a $N(0, \theta)$ population and
we use the MLE
$$\tilde{\theta} = \frac{1}{n} \sum_{i=1}^n (X_i - \bar{X})^2$$
to estimate the variance $\theta$. Calculate the bound in (i) and compare it with the
variance of the estimator $\tilde{\theta}$. You may use the fact that
$$\frac{(n-1)S^2}{\theta} \sim \chi^2_{n-1},$$
where $S^2 = \frac{1}{n-1} \sum_{i=1}^n (X_i - \bar{X})^2$.
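The quoted chi-square fact can be checked by simulation. The sketch below uses an assumed sample size $n = 5$ and $\theta = 2$; if the fact holds, the Monte Carlo mean and variance of $(n-1)S^2/\theta$ should come out near $n - 1 = 4$ and $2(n-1) = 8$.

```python
import random

random.seed(0)
n, theta, reps = 5, 2.0, 100_000

# Simulate (n-1) * S^2 / theta for N(0, theta) samples
vals = []
for _ in range(reps):
    xs = [random.gauss(0.0, theta ** 0.5) for _ in range(n)]
    m = sum(xs) / n
    s2 = sum((x - m) ** 2 for x in xs) / (n - 1)   # sample variance S^2
    vals.append((n - 1) * s2 / theta)

mean_v = sum(vals) / reps
var_v = sum((v - mean_v) ** 2 for v in vals) / reps
print(f"mean ~ {mean_v:.3f} (expect 4), variance ~ {var_v:.3f} (expect 8)")
```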
b) i) A random sample of size $n$ is available from the distribution with density
$$f(x; \theta) = \frac{\theta^3 x(x+1)}{\theta + 2}\, e^{-\theta x} \quad \text{if } x \ge 0,$$
and zero otherwise. Find a function $\tau$ of $\theta$ for which there exists an unbiased estimator
that attains the Cramér-Rao lower bound.
ii) Calculate the Fisher information.
iii) Using (a) and (b) or otherwise, obtain a closed-form expression for $\mathrm{Var}(\bar{X})$.