MA280 Probability
Exercise
Exercise 1. Consider a sequence of independent Bernoulli
trials, each of which is a success with probability p. Let X1 be
the number of failures preceding the first success, and let X2 be
the number of failures between the first two successes. Find
the joint mass function of X1 and X2.
Example
Suppose that 3 balls are randomly selected from an urn
containing 3 red, 4 white, and 5 blue balls. If we let X and Y
denote, respectively, the number of red and white balls chosen,
find the joint probability mass function of X and Y .
Solution. The joint probability mass function is p(i, j) = P{X = i, Y = j}.
Note that X = i, Y = j if, of the 3 balls selected, i are red, j are white, and 3 − i − j are blue, and all subsets of size 3 are equally likely to be chosen. Therefore

p(i, j) = C(3, i) C(4, j) C(5, 3 − i − j) / C(12, 3),   i, j = 0, 1, 2, 3 with i + j ≤ 3,

where C(n, k) denotes the binomial coefficient "n choose k".
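As a quick numerical check (a sketch, not part of the original notes), the joint pmf can be tabulated with Python's math.comb; the probabilities over all feasible (i, j) should sum to 1:

```python
from math import comb

# Joint pmf of (X, Y) = (number of red, number of white) when 3 balls
# are drawn from an urn holding 3 red, 4 white, and 5 blue balls.
def p(i, j):
    if i < 0 or j < 0 or i + j > 3:
        return 0.0
    return comb(3, i) * comb(4, j) * comb(5, 3 - i - j) / comb(12, 3)

total = sum(p(i, j) for i in range(4) for j in range(4))
print(total)    # 1.0 (up to floating point)
print(p(1, 1))  # C(3,1)C(4,1)C(5,1)/C(12,3) = 60/220 ≈ 0.273
```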
Exercise
Exercise 2. Suppose that 4 balls are randomly selected from
an urn containing 3 red, 4 white, and 5 blue balls. If we let X
and Y denote, respectively, the number of red and blue balls
chosen, find the joint probability mass function of X and Y .
Marginal mass functions - summary
The marginal mass functions are

p_X(x_i) := P{X = x_i}, and p_Y(y_j) := P{Y = y_j}.

By the Law of Total Probability,

p_X(x_i) = Σ_j p(x_i, y_j), and p_Y(y_j) = Σ_i p(x_i, y_j).
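A minimal sketch of this summation in code (the joint pmf table below reuses the pmf from the "Checking independence" example later in these notes):

```python
from collections import defaultdict

# Joint pmf stored as a dict mapping (x, y) to p(x, y).
joint = {(10, 1): 0.1, (20, 1): 0.1, (20, 2): 0.1,
         (10, 2): 0.2, (10, 3): 0.2, (20, 3): 0.3}

pX, pY = defaultdict(float), defaultdict(float)
for (x, y), prob in joint.items():
    pX[x] += prob  # p_X(x) = sum over j of p(x, y_j)
    pY[y] += prob  # p_Y(y) = sum over i of p(x_i, y)

print(dict(pX))  # ≈ {10: 0.5, 20: 0.5}
print(dict(pY))  # ≈ {1: 0.2, 2: 0.3, 3: 0.5}
```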
Joint probability density function - summary
X and Y are jointly continuous if

P{(X, Y) ∈ C} = ∫∫_{(x,y)∈C} f(x, y) dx dy

for every set C in the two-dimensional plane.
The function f(x, y) above is the joint probability density function of X and Y.

P{X ∈ A, Y ∈ B} = ∫_B ∫_A f(x, y) dx dy

F(a, b) = P{X ∈ (−∞, a], Y ∈ (−∞, b]} = ∫_{−∞}^{b} ∫_{−∞}^{a} f(x, y) dx dy

f(a, b) = ∂²F(a, b) / ∂a ∂b

The marginal pdf of X:

f_X(x) = ∫_{−∞}^{∞} f(x, y) dy
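These formulas can be sanity-checked numerically (a sketch using the density from the example that follows, for which F(a, b) = (1 − e^{−a})(1 − e^{−2b}) in closed form):

```python
import numpy as np

# Midpoint-rule approximation of F(a, b) for the density
# f(x, y) = 2 e^{-x} e^{-2y} used in the next example.
a, b, n = 1.5, 1.0, 1000
x = (np.arange(n) + 0.5) * (a / n)  # midpoints of the x-cells in (0, a)
y = (np.arange(n) + 0.5) * (b / n)  # midpoints of the y-cells in (0, b)
X, Y = np.meshgrid(x, y)
F_num = np.sum(2 * np.exp(-X) * np.exp(-2 * Y)) * (a / n) * (b / n)
F_exact = (1 - np.exp(-a)) * (1 - np.exp(-2 * b))
print(F_num, F_exact)  # both ≈ 0.672
```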
Example
The joint density function of X and Y is given by
f(x, y) = { 2e^{−x} e^{−2y}   0 < x < ∞, 0 < y < ∞
          { 0                 otherwise
Compute (a) P{X > 1,Y < 1} and (b) P{X < Y}.
Solution. a.

P{X > 1, Y < 1} = ∫_0^1 ∫_1^∞ 2e^{−x} e^{−2y} dx dy
= (∫_1^∞ e^{−x} dx)(∫_0^1 2e^{−2y} dy)
= e^{−1}(1 − e^{−2})

b.

P{X < Y} = ∫∫_{(x,y): x<y} 2e^{−x} e^{−2y} dx dy
= ∫_0^∞ ∫_0^y 2e^{−x} e^{−2y} dx dy
= ∫_0^∞ 2e^{−2y}(1 − e^{−y}) dy
= ∫_0^∞ 2e^{−2y} dy − ∫_0^∞ 2e^{−3y} dy
= 1 − 2/3 = 1/3
Exercise 3. Find P(X + Y > 1).
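Both answers, and Exercise 3, can be checked by simulation (a sketch; since the density factorizes as e^{−x} · 2e^{−2y}, X and Y may be drawn as independent exponential variables with rates 1 and 2):

```python
import numpy as np

# NumPy's exponential sampler takes scale = 1/rate.
rng = np.random.default_rng(0)
n = 10**6
x = rng.exponential(scale=1.0, size=n)  # rate 1
y = rng.exponential(scale=0.5, size=n)  # rate 2
print(np.mean((x > 1) & (y < 1)))  # ≈ e^{-1}(1 - e^{-2}) ≈ 0.318
print(np.mean(x < y))              # ≈ 1/3
print(np.mean(x + y > 1))          # Exercise 3 can be checked the same way
```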
Exercise
Exercise 4. The joint pdf of X and Y is given by
a. Verify that this is indeed a joint density function.
b. Find P(X < Y ) and P(X < a).
Example
The joint density function of X and Y is given by
f(x, y) = { 2e^{−x} e^{−2y}   0 < x < ∞, 0 < y < ∞
          { 0                 otherwise

Derive the marginal pdf of X. Calculate P(X > 5).
Solution.

f_X(x) = ∫_{−∞}^{∞} f(x, y) dy = ∫_0^∞ 2e^{−x} e^{−2y} dy = e^{−x},   x > 0,

and f_X(x) = 0 otherwise, i.e. X ~ Exponential(1). Hence P(X > 5) = ∫_5^∞ e^{−x} dx = e^{−5}.
Independent Random Variables
Definition: Random variables X and Y are independent if, for every A, B,

P{X ∈ A, Y ∈ B} = P{X ∈ A} · P{Y ∈ B}.

Two random variables X and Y are independent if and only if their joint pdf or pmf factorizes into the product of the marginals:

p(x, y) = p_X(x) · p_Y(y)   for all x, y.

i.i.d. is an abbreviation for independent and identically distributed random variables.
Conditional pdf of independent rv
If X and Y are independent, the conditional pdf of Y given
X = x is
f(y | x) = f(x, y) / f_X(x) = f_X(x) f_Y(y) / f_X(x) = f_Y(y)
Example - Checking independence
Consider the discrete bivariate random vector (X ,Y ), with joint
pmf given by
f(10, 1) = f(20, 1) = f(20, 2) = 1/10,
f(10, 2) = f(10, 3) = 1/5, and f(20, 3) = 3/10.
Are X and Y independent?
Solution. The marginal pmfs are

f_X(10) = f_X(20) = 1/2, and f_Y(1) = 1/5, f_Y(2) = 3/10, f_Y(3) = 1/2.

Since

f(10, 3) = 1/5 ≠ 1/2 · 1/2 = f_X(10) f_Y(3),

the random variables X and Y are not independent: f(x, y) fails to factorize into the product of the marginal pmfs for some x and y.
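The same test is easy to automate (a sketch; the tolerance 1e-9 guards against floating-point noise):

```python
# Direct test of independence: compare f(x, y) with fX(x) * fY(y)
# over the whole (finite) support.
joint = {(10, 1): 0.1, (20, 1): 0.1, (20, 2): 0.1,
         (10, 2): 0.2, (10, 3): 0.2, (20, 3): 0.3}

xs = sorted({x for x, _ in joint})
ys = sorted({y for _, y in joint})
fX = {x: sum(joint.get((x, y), 0.0) for y in ys) for x in xs}
fY = {y: sum(joint.get((x, y), 0.0) for x in xs) for y in ys}

independent = all(abs(joint.get((x, y), 0.0) - fX[x] * fY[y]) < 1e-9
                  for x in xs for y in ys)
print(independent)  # False: f(10, 3) = 0.2 while fX(10) * fY(3) = 0.25
```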
Example
Suppose that the number of customers entering the post office follows a Poi(λ) distribution. Assume also that each person, independently of the others, is female with probability p and male with probability 1 − p. Find the joint distribution of the number X of females and the number Y of males, and the marginal distributions of X and Y. Are X and Y independent?
Solution.

p(i, j) = P{X = i, Y = j} = P{X = i, Y = j | X + Y = i + j} · P{X + Y = i + j}

P{X + Y = i + j} = e^{−λ} λ^{i+j} / (i + j)!

Since, given the total number X + Y of people, each of them is independently female with probability p or male with probability 1 − p,

(X | X + Y = i + j) ~ Binom(i + j, p).
Example - solution (continued)
Therefore,

P{X = i, Y = j | X + Y = i + j} = P{X = i | X + Y = i + j} = C(i + j, i) p^i (1 − p)^j.

p(i, j) = C(i + j, i) p^i (1 − p)^j · e^{−λ} λ^{i+j} / (i + j)!
= e^{−λ} p^i (1 − p)^j · λ^i λ^j / (i! j!)
= e^{−λp} (λp)^i / i! · e^{−λ(1−p)} (λ(1 − p))^j / j!
= p_{Poi(λp)}(i) · p_{Poi(λ(1−p))}(j)

Thus
X ~ Poi(λp);
Y ~ Poi(λ(1 − p));
X and Y are independent.
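A simulation sketch of this result (the parameter values λ = 10, p = 0.3 are arbitrary choices): the female count should behave like Poi(λp), the male count like Poi(λ(1 − p)), and the two should be uncorrelated:

```python
import numpy as np

rng = np.random.default_rng(1)
lam, p, n = 10.0, 0.3, 10**6      # arbitrary parameter choices
total = rng.poisson(lam, size=n)  # number of customers, ~ Poi(lam)
females = rng.binomial(total, p)  # each customer is female w.p. p
males = total - females

print(females.mean(), females.var())      # both ≈ lam * p = 3.0
print(males.mean(), males.var())          # both ≈ lam * (1 - p) = 7.0
print(np.corrcoef(females, males)[0, 1])  # ≈ 0, consistent with independence
```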
More theorems
X and Y are independent random variables if and only if there exist functions g(x) and h(y) such that, for every x ∈ ℜ and y ∈ ℜ,

f(x, y) = g(x) h(y).

Theorem. Let X and Y be independent random variables. Then:
P(X ∈ A, Y ∈ B) = P(X ∈ A) P(Y ∈ B); that is, the events {X ∈ A} and {Y ∈ B} are independent events;
g(X) and h(Y) are independent;
E[g(X) h(Y)] = E[g(X)] E[h(Y)].
Exercise
Exercise 5. (a) The joint density of X and Y is given by
Are X and Y independent?
(b) The joint density of X and Y is given by
Are X and Y independent?
Example
Let X and Y be independent Exponential(1) random variables.

P(X ≥ 4, Y < 3) = P(X ≥ 4) P(Y < 3) = e^{−4}(1 − e^{−3}).

Letting g(x) = x² and h(y) = y, we see that

E(X² Y) = (E[X²])(E[Y]) = (Var(X) + (E[X])²) E[Y] = (1 + 1²) · 1 = 2.
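Both computations can be verified by simulation (a sketch, not part of the original notes):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 10**6
x = rng.exponential(1.0, size=n)  # X ~ Exponential(1)
y = rng.exponential(1.0, size=n)  # Y ~ Exponential(1), independent of X
print(np.mean((x >= 4) & (y < 3)))  # ≈ e^{-4}(1 - e^{-3}) ≈ 0.0174
print(np.mean(x**2 * y))            # ≈ E[X^2] E[Y] = 2
```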
Example
The results regarding independence can be generalized to
multiple random variables.
Example. Let X ,Y ,Z be independent and uniformly distributed
over (0,1). Compute P{X ≥ YZ}.
Solution. By independence,

f_{X,Y,Z}(x, y, z) = f_X(x) f_Y(y) f_Z(z) = 1,   0 ≤ x ≤ 1, 0 ≤ y ≤ 1, 0 ≤ z ≤ 1,

so we have

P{X ≥ YZ} = ∫∫∫_{x ≥ yz} f_{X,Y,Z}(x, y, z) dx dy dz
= ∫_0^1 ∫_0^1 ∫_{yz}^1 dx dy dz
= ∫_0^1 ∫_0^1 (1 − yz) dy dz
= ∫_0^1 (1 − z/2) dz = 3/4
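A one-line Monte Carlo check (a sketch):

```python
import numpy as np

rng = np.random.default_rng(3)
x, y, z = rng.random((3, 10**6))  # X, Y, Z independent Uniform(0, 1)
print(np.mean(x >= y * z))        # ≈ 3/4
```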
Sum of independent discrete random variables
Example. Suppose X1 and X2 are independent Poisson random variables with respective parameters λ1 and λ2. Compute the distribution of X1 + X2.
Solution.
Note {X1 + X2 = n} = ∪_{k=0}^{n} {X1 = k, X2 = n − k}, so

P{X1 + X2 = n} = Σ_{k=0}^{n} P{X1 = k, X2 = n − k}
= Σ_{k=0}^{n} P{X1 = k} P{X2 = n − k}
= Σ_{k=0}^{n} e^{−λ1} (λ1^k / k!) · e^{−λ2} (λ2^{n−k} / (n − k)!)
= e^{−(λ1+λ2)} Σ_{k=0}^{n} λ1^k λ2^{n−k} / (k! (n − k)!)
= (e^{−(λ1+λ2)} / n!) Σ_{k=0}^{n} (n! / (k! (n − k)!)) λ1^k λ2^{n−k}
= (e^{−(λ1+λ2)} / n!) (λ1 + λ2)^n,

by the binomial theorem. Thus X1 + X2 ~ Poi(λ1 + λ2).
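The convolution sum can be evaluated directly and compared with the Poi(λ1 + λ2) pmf (a sketch with arbitrary parameters):

```python
from math import exp, factorial

def poi(lam, k):
    # Poisson pmf: e^{-lam} * lam^k / k!
    return exp(-lam) * lam**k / factorial(k)

lam1, lam2 = 2.0, 3.0  # arbitrary parameters
for n in range(8):
    conv = sum(poi(lam1, k) * poi(lam2, n - k) for k in range(n + 1))
    print(n, round(conv, 6), round(poi(lam1 + lam2, n), 6))  # columns agree
```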
Distribution of sum of independent continuous
random variables
Suppose that X and Y are independent, continuous random variables having probability density functions f_X and f_Y. Then

F_{X+Y}(a) = P{X + Y ≤ a}
= ∫∫_{x+y≤a} f_X(x) f_Y(y) dx dy
= ∫_{−∞}^{∞} ∫_{−∞}^{a−y} f_X(x) f_Y(y) dx dy
= ∫_{−∞}^{∞} (∫_{−∞}^{a−y} f_X(x) dx) f_Y(y) dy
= ∫_{−∞}^{∞} F_X(a − y) f_Y(y) dy
The cumulative distribution function FX+Y is called the
convolution of the distributions FX and FY (the cumulative
distribution functions of X and Y , respectively).
Distribution of sum of independent continuous
random variables
Suppose that X and Y are independent, continuous random variables having probability density functions f_X and f_Y. The probability density function f_{X+Y} of X + Y is

f_{X+Y}(a) = (d/da) ∫_{−∞}^{∞} F_X(a − y) f_Y(y) dy
= ∫_{−∞}^{∞} (d/da) F_X(a − y) f_Y(y) dy
= ∫_{−∞}^{∞} f_X(a − y) f_Y(y) dy
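The convolution integral can be evaluated on a grid (a sketch, here for two standard normal densities, anticipating the example two slides ahead; the result matches the Normal(0, 2) density):

```python
import numpy as np

def phi(x, var=1.0):
    # Normal(0, var) density
    return np.exp(-x**2 / (2 * var)) / np.sqrt(2 * np.pi * var)

a = 1.2                                    # point at which to evaluate f_{X+Y}
y = np.linspace(-10, 10, 20001)
dy = y[1] - y[0]
f_conv = np.sum(phi(a - y) * phi(y)) * dy  # approximates ∫ f_X(a-y) f_Y(y) dy
print(f_conv, phi(a, var=2.0))             # both ≈ 0.1968
```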
Linear transformation of Normal rv
If X ~ Normal(µ, σ²), then aX + b ~ Normal(aµ + b, a²σ²).
Exercise 6. Prove the above statement.
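Before proving it, the claim can at least be checked empirically (a sketch with arbitrary µ, σ, a, b):

```python
import numpy as np

rng = np.random.default_rng(4)
mu, sigma, a, b = 1.0, 2.0, -3.0, 5.0  # arbitrary choices
y = a * rng.normal(mu, sigma, size=10**6) + b
print(y.mean(), y.std())  # ≈ a*mu + b = 2.0 and |a|*sigma = 6.0
```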
Example - standard normal rv
Suppose Z1 ∼ Normal(0,1), Z2 ∼ Normal(0,1), and Z1 and Z2
are independent. Find the distribution of Z1 + Z2.
Solution.

f_{Z1+Z2}(a) = ∫_{−∞}^{∞} f_{Z1}(a − y) f_{Z2}(y) dy
= ∫_{−∞}^{∞} (1/√(2π)) exp{−(a − y)²/2} · (1/√(2π)) exp{−y²/2} dy
= ∫_{−∞}^{∞} (1/(2π)) exp{−(y² − ay + a²/2)} dy

Completing the square, y² − ay + a²/2 = (y − a/2)² + a²/4, so

f_{Z1+Z2}(a) = (1/(2π)) e^{−a²/4} ∫_{−∞}^{∞} e^{−(y − a/2)²} dy = (1/(2π)) e^{−a²/4} · √π = (1/√(4π)) e^{−a²/4}.

Thus Z1 + Z2 ~ Normal(0, 2).
Example - standard normal rv
Suppose X1 ∼ Normal(0, σ21), X2 ∼ Normal(0, σ22), and X1 and
X2 are independent. Show that X1 + X2 ∼ Normal(0, σ21 + σ22).
Solution. Write c := 1/(2σ1²) + 1/(2σ2²). Then

f_{X1+X2}(a) = ∫_{−∞}^{∞} f_{X1}(a − y) f_{X2}(y) dy
= ∫_{−∞}^{∞} (1/(√(2π) σ1)) exp{−(a − y)²/(2σ1²)} · (1/(√(2π) σ2)) exp{−y²/(2σ2²)} dy

Completing the square in the exponent,

−(a − y)²/(2σ1²) − y²/(2σ2²) = −c (y − a/(2σ1² c))² + a²/(4σ1⁴ c) − a²/(2σ1²),

and a short computation gives a²/(4σ1⁴ c) − a²/(2σ1²) = −a²/(2(σ1² + σ2²)). Hence

f_{X1+X2}(a) = (1/(2πσ1σ2)) exp{−a²/(2(σ1² + σ2²))} ∫_{−∞}^{∞} exp{−c (y − a/(2σ1² c))²} dy
= C exp{−a²/(2(σ1² + σ2²))},

where C = (1/(2πσ1σ2)) √(π/c) is a constant that does not depend on a; since f_{X1+X2} must integrate to 1, C = 1/√(2π(σ1² + σ2²)).
Thus X1 + X2 ~ Normal(0, σ1² + σ2²).
Normal random variables
If X1 ~ Normal(µ1, σ1²) and X2 ~ Normal(µ2, σ2²), and X1 and X2 are independent, then

X1 + X2 ~ Normal(µ1 + µ2, σ1² + σ2²).

Proof.
Note Y1 := (X1 − µ1)/σ2 ~ Normal(0, σ1²/σ2²) and Y2 := (X2 − µ2)/σ2 ~ Normal(0, 1).
It follows from our previous result that

Y1 + Y2 = (X1 − µ1)/σ2 + (X2 − µ2)/σ2 ~ Normal(0, 1 + σ1²/σ2²)

=⇒ X1 + X2 = σ2(Y1 + Y2) + (µ1 + µ2) is normal with mean µ1 + µ2 and variance σ2²(1 + σ1²/σ2²) = σ1² + σ2².
Normal random variables
If Xi ~ Normal(µi, σi²) for i = 1, . . . , n, and X1, X2, . . . , Xn are mutually independent, then

Σ_{i=1}^{n} Xi ~ Normal(Σ_{i=1}^{n} µi, Σ_{i=1}^{n} σi²).
Proof. Use induction.
Example 3a: Sum of two independent uniform
random variables
If X and Y are independent random variables, both uniformly
distributed on (0,1), calculate the probability density of X + Y .
Solution. Note

f_X(a) = f_Y(a) = { 1   0 < a < 1
                  { 0   otherwise

so we obtain f_{X+Y}(a) = ∫_0^1 f_X(a − y) dy. Then:
For 0 ≤ a ≤ 1, f_{X+Y}(a) = ∫_0^a dy = a.
For 1 < a < 2, f_{X+Y}(a) = ∫_{a−1}^1 dy = 2 − a.
Hence,

f_{X+Y}(a) = { a       0 ≤ a ≤ 1
             { 2 − a   1 < a < 2
             { 0       otherwise
Note the shape of its density function: it rises linearly from 0 at a = 0 to a peak of 1 at a = 1, then falls linearly back to 0 at a = 2. Because of this shape, the random variable X + Y is said to have a triangular distribution.
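A simulation sketch comparing a histogram of X + Y with the triangular density derived above:

```python
import numpy as np

rng = np.random.default_rng(5)
s = rng.random(10**6) + rng.random(10**6)  # X + Y with X, Y ~ Uniform(0, 1)
hist, edges = np.histogram(s, bins=40, range=(0, 2), density=True)
mids = (edges[:-1] + edges[1:]) / 2
tri = np.where(mids <= 1, mids, 2 - mids)  # triangular density: a, then 2 - a
print(np.max(np.abs(hist - tri)))          # small, e.g. < 0.02
```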