Date: 2023-12-22
ST305 Statistical Inference
Variance and higher moments
Definition. The variance of X, denoted by Var(X), is defined by
Var(X) = E[(X − E[X])^2].
√Var(X) is called the standard deviation of X.
Alternatively,
Var(X) = E[X^2] − (E[X])^2.
The nth moment of a discrete random variable X:
E[X^n] = ∑_{x: f_X(x)>0} x^n f_X(x)
The nth moment of a continuous random variable X:
E[X^n] = ∫_{−∞}^{∞} x^n f_X(x) dx
Example
Calculate Var(X ) if X represents the outcome when a fair die is
rolled.
Solution.
E[X] = 1·(1/6) + 2·(1/6) + ··· + 6·(1/6) = 7/2
E[X^2] = 1^2·(1/6) + 2^2·(1/6) + ··· + 6^2·(1/6) = 91/6
Var(X) = 91/6 − (7/2)^2 = 35/12
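The arithmetic above can be double-checked with exact rational arithmetic; a minimal sketch in Python:

```python
from fractions import Fraction

# pmf of a fair die: each face 1..6 has probability 1/6
pmf = {x: Fraction(1, 6) for x in range(1, 7)}

EX = sum(x * p for x, p in pmf.items())       # E[X]
EX2 = sum(x * x * p for x, p in pmf.items())  # E[X^2]
var = EX2 - EX**2                             # Var(X) = E[X^2] - (E[X])^2

print(EX, EX2, var)  # 7/2 91/6 35/12
```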
Example: Exponential random variables - variance
Consider a random variable X with the following density:
f(x) =
{ 0, if x ≤ 0
{ (1/λ) e^{−x/λ}, if x > 0
Find Var(X).
Solution. Var(X) = λ^2. Indeed,
E[X^2] = ∫_0^∞ x^2 (1/λ) e^{−x/λ} dx = 2λ^2
using integration by parts (twice); similarly E[X] = λ, so
Var(X) = E[X^2] − (E[X])^2 = 2λ^2 − λ^2 = λ^2.
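As a sanity check, E[X] and E[X^2] can be approximated by numerical integration; a rough sketch with a trapezoidal rule on a truncated range (λ = 2 is an arbitrary illustrative value):

```python
import math

lam = 2.0  # illustrative value of λ
f = lambda x: math.exp(-x / lam) / lam  # exponential density for x > 0

# trapezoidal rule on [0, 40λ]; the truncated tail is negligible
n = 50000
h = 40 * lam / n
w = lambda i: 0.5 if i in (0, n) else 1.0
EX = h * sum(w(i) * (i * h) * f(i * h) for i in range(n + 1))
EX2 = h * sum(w(i) * (i * h) ** 2 * f(i * h) for i in range(n + 1))

print(round(EX, 3), round(EX2 - EX**2, 3))  # ≈ λ = 2.0 and λ² = 4.0
```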
Useful properties of variance
For any constants a and b,
Var(aX + b) = a^2 · Var(X)
Notice the square on a.
Var(X + b) = Var(X) = Var(−X): the variance is invariant to shifting the random variable by a constant b and to reflecting it.
Proof.
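These identities can also be verified exactly on a finite distribution; a small sketch using the fair-die pmf (the values a = 3, b = −5 are arbitrary):

```python
from fractions import Fraction

def variance(pmf):
    """Var(X) = E[X^2] - (E[X])^2 for a finite pmf {value: probability}."""
    ex = sum(x * p for x, p in pmf.items())
    ex2 = sum(x * x * p for x, p in pmf.items())
    return ex2 - ex * ex

die = {Fraction(x): Fraction(1, 6) for x in range(1, 7)}
a, b = Fraction(3), Fraction(-5)

scaled = {a * x + b: p for x, p in die.items()}    # pmf of aX + b
assert variance(scaled) == a**2 * variance(die)    # Var(aX + b) = a^2 Var(X)
assert variance({x + b: p for x, p in die.items()}) == variance(die)  # shift
assert variance({-x: p for x, p in die.items()}) == variance(die)     # reflection
print("all three identities hold")
```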
Example: Binomial Variance
Let X ∼ Binom(n, p), that is, P(X = i) = C(n, i) p^i (1 − p)^{n−i},
i = 0, 1, ..., n. Find Var(X).
Solution.
Var(X) = E[X^2] − (E[X])^2 = E[X(X − 1)] + E[X] − (E[X])^2
Observe first that i(i − 1) = (d^2/dt^2) t^i |_{t=1}.
E[X(X − 1)] = ∑_{i=0}^n C(n, i) i(i − 1) · p^i (1 − p)^{n−i}
            = ∑_{i=0}^n C(n, i) (d^2/dt^2) t^i |_{t=1} · p^i (1 − p)^{n−i}
            = (d^2/dt^2) ( ∑_{i=0}^n C(n, i) t^i · p^i (1 − p)^{n−i} ) |_{t=1}
            = (d^2/dt^2) (tp + 1 − p)^n |_{t=1}
            = n(n − 1)(tp + 1 − p)^{n−2} · p^2 |_{t=1} = n(n − 1)p^2.
Var(X) = n(n − 1)p^2 + np − (np)^2 = np(1 − p).
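Both the intermediate identity E[X(X − 1)] = n(n − 1)p^2 and the final formula can be confirmed exactly for specific values of n and p (chosen arbitrarily here):

```python
from fractions import Fraction
from math import comb

n, p = 10, Fraction(3, 10)
pmf = {i: comb(n, i) * p**i * (1 - p)**(n - i) for i in range(n + 1)}

fact2 = sum(i * (i - 1) * q for i, q in pmf.items())  # E[X(X-1)]
EX = sum(i * q for i, q in pmf.items())               # E[X] = np
var = fact2 + EX - EX**2

assert fact2 == n * (n - 1) * p**2
assert EX == n * p
assert var == n * p * (1 - p)
print(var)  # 21/10
```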
Moment generating function (mgf)
Definition 2.3.6. Let X be a random variable with cdf F_X.
The moment generating function (mgf) of X (or of F_X),
denoted by M_X(t), is
M_X(t) = E[e^{tX}].
If X is continuous, the mgf of X is
M_X(t) = ∫_{−∞}^{∞} e^{tx} f_X(x) dx.
If X is discrete, the mgf of X is
M_X(t) = ∑_x e^{tx} P(X = x).
Example
Show that the mgf of X which has pmf f_X(n) = e^{−λ} λ^n / n!,
n = 0, 1, ..., λ > 0, is M_X(t) = e^{λ(e^t − 1)}.
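The claimed closed form can be checked numerically by truncating the defining series (λ and t are arbitrary illustrative values):

```python
import math

lam, t = 1.5, 0.7
# M_X(t) = sum_{n>=0} e^{tn} * e^{-lam} lam^n / n!, truncated at 100 terms
series = sum(math.exp(t * n) * math.exp(-lam) * lam**n / math.factorial(n)
             for n in range(100))
closed = math.exp(lam * (math.exp(t) - 1))
print(abs(series - closed) < 1e-10)  # True
```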
Exercise 1
Find the mgf of X with P(X = n) = p(1 − p)^n, n = 0, 1, 2, ...,
0 < p < 1.
How does the mgf generate moments
Theorem. If X has mgf M_X(t), then
E[X^n] = M_X^{(n)}(0) = (d^n/dt^n) M_X(t) |_{t=0}
That is, the nth moment is equal to the nth derivative of
M_X(t) evaluated at t = 0.
Proof: Assuming that we can differentiate under the integral
sign (see a later section), we have
(d/dt) M_X(t) = (d/dt) ∫_{−∞}^{∞} e^{tx} f_X(x) dx
             = ∫_{−∞}^{∞} ((d/dt) e^{tx}) f_X(x) dx
             = ∫_{−∞}^{∞} x e^{tx} f_X(x) dx = E[X e^{tX}]
How does the mgf generate moments
Thus,
(d/dt) M_X(t) |_{t=0} = E[X e^{tX}] |_{t=0} = E[X]
Proceeding in an analogous manner, we can establish that
(d^n/dt^n) M_X(t) |_{t=0} = E[X^n e^{tX}] |_{t=0} = E[X^n]
Activity: Derive the above equality.
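As an illustration, central finite differences applied to the Poisson mgf from the earlier example recover its first two moments (λ = 2 is an arbitrary choice):

```python
import math

lam = 2.0
M = lambda t: math.exp(lam * (math.exp(t) - 1))  # Poisson mgf

h = 1e-4  # central differences approximate M'(0) and M''(0)
m1 = (M(h) - M(-h)) / (2 * h)              # ≈ E[X] = λ
m2 = (M(h) - 2 * M(0.0) + M(-h)) / h**2    # ≈ E[X^2] = λ + λ²

print(round(m1, 3), round(m2, 3))  # ≈ 2.0 and 6.0
```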
Example - Gamma mgf
(Gamma mgf) Suppose X ∼ f(x):
f(x) = (1 / (Γ(α) β^α)) x^{α−1} e^{−x/β}, 0 < x < ∞, α > 0, β > 0
where Γ(α) := ∫_0^∞ y^{α−1} e^{−y} dy denotes the gamma function.
Find the mgf of X.
Solution.
M_X(t) = (1 / (Γ(α) β^α)) ∫_0^∞ e^{tx} x^{α−1} e^{−x/β} dx
       = (1 / (Γ(α) β^α)) ∫_0^∞ x^{α−1} e^{−(1/β − t) x} dx
       = (1 / (Γ(α) β^α)) ∫_0^∞ x^{α−1} e^{−x / (β/(1−βt))} dx    (1)
Example - Gamma mgf
Note that any function of the following form
f(x) = (1 / (Γ(a) b^a)) x^{a−1} e^{−x/b}
is a pdf ⇒ ∫_0^∞ (1 / (Γ(a) b^a)) x^{a−1} e^{−x/b} dx = 1 ⇒
∫_0^∞ x^{a−1} e^{−x/b} dx = Γ(a) b^a    (2)
Applying (2) to (1), we have
M_X(t) = (1 / (Γ(α) β^α)) Γ(α) (β / (1 − βt))^α = (1 / (1 − βt))^α    if t < 1/β
(Note: The mgf of the gamma distribution exists only if t < 1/β.
This will be explored later.)
E[X] = (d/dt) M_X(t) |_{t=0} = αβ / (1 − βt)^{α+1} |_{t=0} = αβ
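The closed form can be checked against the integral definition by numerical integration (α, β, t are arbitrary illustrative values with t < 1/β):

```python
import math

alpha, beta, t = 2.5, 1.2, 0.3  # any t < 1/beta ≈ 0.833 works
f = lambda x: x**(alpha - 1) * math.exp(-x / beta) / (math.gamma(alpha) * beta**alpha)

# trapezoidal approximation of the integral defining M_X(t) on [0, 120]
n = 120000
h = 120.0 / n
w = lambda i: 0.5 if i in (0, n) else 1.0
numeric = h * sum(w(i) * math.exp(t * i * h) * f(i * h) for i in range(n + 1))
closed = (1 / (1 - beta * t)) ** alpha

print(abs(numeric - closed) < 1e-4)  # True
```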
Exercise 2
(Gamma mgf) Suppose X ∼ f(x):
f(x) = (1 / Γ(3)) x^2 e^{−x}, 0 < x < ∞.
Find E[X^2] using two methods: (i) by applying the definition of
higher moments and (ii) by deriving the mgf of X and then
using the mgf.
Example - Binomial mgf
Suppose X has the following probability mass function:
P(X = i) = C(n, i) p^i (1 − p)^{n−i}, i = 0, 1, ..., n. Find the mgf of X.
(Hint: The binomial formula: ∑_{x=0}^n C(n, x) u^x v^{n−x} = (u + v)^n.)
Solution.
M_X(t) = ∑_{x=0}^n e^{tx} C(n, x) p^x (1 − p)^{n−x}    (3)
       = ∑_{x=0}^n C(n, x) (p e^t)^x (1 − p)^{n−x}    (4)
Hence, using the binomial formula we have
M_X(t) = [p e^t + (1 − p)]^n.
Activity: Find the variance of X.
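For the activity, one route is via the mgf: Var(X) = M''(0) − (M'(0))^2. A numerical sketch, with n = 12 and p = 0.35 chosen arbitrarily and the derivatives taken by central differences:

```python
import math
from math import comb

n, p = 12, 0.35
M_sum = lambda t: sum(math.exp(t * x) * comb(n, x) * p**x * (1 - p)**(n - x)
                      for x in range(n + 1))
M = lambda t: (p * math.exp(t) + 1 - p)**n  # closed form from the solution

# (i) the defining sum and the closed form agree
for t in (-1.0, 0.0, 0.5):
    assert abs(M_sum(t) - M(t)) < 1e-9

# (ii) moments via derivatives of the mgf at t = 0
h = 1e-4
m1 = (M(h) - M(-h)) / (2 * h)              # ≈ E[X] = np
m2 = (M(h) - 2 * M(0.0) + M(-h)) / h**2    # ≈ E[X^2]
print(round(m1, 2), round(m2 - m1**2, 2))  # ≈ np = 4.2 and np(1-p) = 2.73
```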
Discussion
Compare the two ways we have used to find the variance of X:
one via the definition of higher moments and the other via the
mgf. Which method is more efficient?
Usefulness of mgf
Theorem. Let F_X(·) and F_Y(·) be two cdfs all of whose
moments exist. If the mgfs exist and M_X(t) = M_Y(t) for all t
in some neighborhood of 0, then F_X(u) = F_Y(u) for all u.
The major usefulness of the mgf is that it can characterize a
distribution; it is not primarily its ability to generate
moments.
Mgf - a property
For any constants a and b, the mgf of aX + b is given by:
M_{aX+b}(t) = e^{bt} M_X(at).
Proof.
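A quick numerical check of this property, using the binomial mgf from the earlier example (the values of n, p, a, b are arbitrary):

```python
import math
from math import comb

n, p, a, b = 8, 0.4, 2.0, -3.0
MX = lambda t: (p * math.exp(t) + 1 - p)**n  # binomial mgf

# mgf of Y = aX + b computed directly from the pmf of X
MY = lambda t: sum(math.exp(t * (a * x + b)) * comb(n, x) * p**x * (1 - p)**(n - x)
                   for x in range(n + 1))

for t in (-0.5, 0.1, 1.0):
    assert math.isclose(MY(t), math.exp(b * t) * MX(a * t), rel_tol=1e-12)
print("M_{aX+b}(t) = e^{bt} M_X(at) verified")
```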