MAST20006/MAST90057 – Module 6. Stochastic Processes and Markov Chains
Module 6. Stochastic Processes and Markov Chains
Sophie Hautphenne and Feng Liu
The University of Melbourne
2023
Stochastic Processes and Markov Chains
A (discrete-time) stochastic process is a collection of random
variables Xt, t ∈ T. For example,
Xt could be the number of patients visiting a hospital on day t,
Xt could be the ASX All Ords index value at time t,
Xt could be the amount of rainfall in month t.
For each t ∈ T , Xt takes values in a set S. We call S the state
space of the stochastic process. The set T is called the index set.
The state space S may be discrete or continuous. The index set T
may also be discrete or continuous.
We only consider the case where both S and T are discrete. So
S = {s1, s2, . . . , sN} or S = {s1, s2, . . . }, and T = {t1, t2, . . . , tM}
or T = {t0, t1, . . . }.
One can encode the states by giving sj the label j, so we can
assume that S = {1, 2, . . . , N} or S = {1, 2, . . .}.
Similarly, we can assume that T = {0, 1, 2, . . . ,M} or
T = {0, 1, 2, . . .}.
Definition 1
A stochastic process {Xn}∞n=0 is called a discrete-time discrete-state
Markov Chain if for all n ≥ 0 and j0, j1, · · · , jn, jn+1 ∈ S,
P (Xn+1 = jn+1 | X0 = j0, X1 = j1, · · · , Xn = jn)
= P (Xn+1 = jn+1 | Xn = jn) .
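The Markov property in Definition 1 can be illustrated with a short simulation sketch in Python/NumPy: at each step, the next state is drawn using only the current state, never the earlier history. (The two-state matrix below is a made-up example for illustration, not one from these notes.)

```python
import numpy as np

# Hypothetical one-step transition matrix of a two-state chain.
# Row i gives the distribution of the next state given current state i.
P = np.array([[0.7, 0.3],
              [0.4, 0.6]])

def simulate_chain(P, start, n_steps, rng):
    """Simulate n_steps transitions of a Markov chain with matrix P."""
    path = [start]
    state = start
    for _ in range(n_steps):
        # Markov property: the next state depends only on the current state.
        state = rng.choice(len(P), p=P[state])
        path.append(state)
    return path

rng = np.random.default_rng(0)
path = simulate_chain(P, start=0, n_steps=10, rng=rng)
print(path)
```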
The significance of a Markov chain lies in the fact that if
{Xn+1 = jn+1} is a future event, then the conditional probability of
this event given the past history {X0 = j0, X1 = j1, · · · , Xn = jn}
depends only upon the immediate past {Xn = jn} and not upon the
remote past {X0 = j0, X1 = j1, · · · , Xn−1 = jn−1}.
The Markov chain is homogeneous (or time-homogeneous) if it also
has the property that, for all i, j ∈ S and m, n ∈ T,
$$P(X_{n+m} = j \mid X_n = i) = P(X_m = j \mid X_0 = i).$$
That is, the probability that the chain moves from state i to state j
in m steps does not depend on the starting time n.
Transition probabilities and transition matrices
For a homogeneous Markov chain, we define:
1. The m-step transition probability
$$P^{(m)}_{ij} = P(X_{n+m} = j \mid X_n = i).$$
2. The one-step transition probability
$$P_{ij} := P^{(1)}_{ij} = P(X_{n+1} = j \mid X_n = i).$$
3. The m-step transition matrix
$$P^{(m)} = \begin{pmatrix} P^{(m)}_{11} & P^{(m)}_{12} & \cdots & P^{(m)}_{1j} & \cdots \\ P^{(m)}_{21} & P^{(m)}_{22} & \cdots & P^{(m)}_{2j} & \cdots \\ \vdots & \vdots & \ddots & \vdots & \ddots \end{pmatrix}_{|S| \times |S|}$$
4. The one-step transition matrix
$$P := P^{(1)} = \begin{pmatrix} P_{11} & P_{12} & \cdots & P_{1j} & \cdots \\ P_{21} & P_{22} & \cdots & P_{2j} & \cdots \\ \vdots & \vdots & \ddots & \vdots & \ddots \end{pmatrix}_{|S| \times |S|}$$
(Note that all row sums equal 1!)
One is usually given P, and one's first task is to find P^{(m)}.
Example 1. Suppose
$$P = \begin{pmatrix} 0.28 & 0.72 \\ 0.08 & 0.92 \end{pmatrix}.$$
That is,
$$P(X_{n+1} = 1 \mid X_n = 1) = 0.28, \quad P(X_{n+1} = 2 \mid X_n = 1) = 0.72,$$
$$P(X_{n+1} = 1 \mid X_n = 2) = 0.08, \quad P(X_{n+1} = 2 \mid X_n = 2) = 0.92.$$
Find $P^{(2)}_{12}$, $P^{(2)}$, $P^{(2)}_{21}$, $P^{(3)}_{12}$, $P^{(3)}$, $P^{(3)}_{22}$.

$$\begin{aligned} P^{(2)}_{12} &= P(X_{n+2} = 2 \mid X_n = 1) \\ &= \sum_{j=1}^{2} P(X_{n+1} = j \mid X_n = 1)\, P(X_{n+2} = 2 \mid X_{n+1} = j) \\ &= 0.28 \times 0.72 + 0.72 \times 0.92 = (0.28, 0.72) \begin{pmatrix} 0.72 \\ 0.92 \end{pmatrix} = 0.864. \end{aligned}$$

It can be seen that $P^{(2)}_{12}$ equals the (1, 2)-th element of
$$P^2 = P \times P = \begin{pmatrix} 0.28 & 0.72 \\ 0.08 & 0.92 \end{pmatrix} \begin{pmatrix} 0.28 & 0.72 \\ 0.08 & 0.92 \end{pmatrix} = \begin{pmatrix} 0.136 & 0.864 \\ 0.096 & 0.904 \end{pmatrix}.$$
Similarly, $P^{(2)}_{ij}$ equals the (i, j)-th element of $P^2$.
Hence $P^{(2)} = P^2$ and $P^{(2)}_{21} = 0.096$.

$$\begin{aligned} P^{(3)}_{12} &= P(X_{n+3} = 2 \mid X_n = 1) \\ &= \sum_{j=1}^{2} P(X_{n+2} = j \mid X_n = 1)\, P(X_{n+3} = 2 \mid X_{n+2} = j) \\ &= 0.136 \times 0.72 + 0.864 \times 0.92 = (0.136, 0.864) \begin{pmatrix} 0.72 \\ 0.92 \end{pmatrix} = 0.8928, \end{aligned}$$

which is the (1, 2)-th element of
$$P^3 = P^2 \times P = \begin{pmatrix} 0.136 & 0.864 \\ 0.096 & 0.904 \end{pmatrix} \begin{pmatrix} 0.28 & 0.72 \\ 0.08 & 0.92 \end{pmatrix} = \begin{pmatrix} 0.1072 & 0.8928 \\ 0.0992 & 0.9008 \end{pmatrix}.$$

Following the same argument, it can be seen that $P^{(3)} = P^3$.
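The hand calculations in Example 1 can be checked numerically: the two- and three-step transition matrices are just matrix powers of P (a quick NumPy sketch).

```python
import numpy as np

# One-step transition matrix from Example 1.
P = np.array([[0.28, 0.72],
              [0.08, 0.92]])

# Two- and three-step transition matrices as matrix products.
P2 = P @ P       # P^(2) = P^2
P3 = P2 @ P      # P^(3) = P^3

print(P2)  # [[0.136 0.864], [0.096 0.904]]
print(P3)  # [[0.1072 0.8928], [0.0992 0.9008]]
```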
In general we have the following theorem:
Theorem 1
The relationship between the m-step transition matrix P (m) and the
one-step transition matrix P is
P (m) = Pm.
For some Markov chains, it can be proved that $\lim_{m\to\infty} P^m$ exists
and $\lim_{m\to\infty} P^{(m)}_{ij} = \pi_j$, where $\{\pi_j, j \in S\}$ is a probability
distribution which does not depend on i.
$\lim_{m\to\infty} P^{(m)}_{ij} = \pi_j$ means that, regardless of the initial state i
of the chain, the probability of being in state j after m steps tends
to $\pi_j$ as m goes to infinity.
{πj , j ∈ S} is called the equilibrium (or stationary) distribution of
the chain.
It can be shown that, if $\{\pi_j, j \in S\}$ exists, it can be found from the
equation
$$(\pi_1, \pi_2, \cdots)\, P = (\pi_1, \pi_2, \cdots), \quad \text{subject to the constraint} \quad \sum_{j \in S} \pi_j = 1.$$
The equation means that if $X_0$ has distribution $\{\pi_j, j \in S\}$, then so
do $X_1$, $X_2$, and so on.
In Example 1, one can verify that $(\pi_1, \pi_2) = (0.1, 0.9)$ solves the
above equation, and one can also verify that
$$\lim_{m\to\infty} P^m = \lim_{m\to\infty} \begin{pmatrix} 0.28 & 0.72 \\ 0.08 & 0.92 \end{pmatrix}^m = \begin{pmatrix} 0.1 & 0.9 \\ 0.1 & 0.9 \end{pmatrix}.$$
So in the long run the Markov chain is in state 1 with probability 0.1
and in state 2 with probability 0.9.
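One way to find the equilibrium distribution numerically is to solve πP = π together with the normalisation constraint. A sketch for Example 1: rewrite πP = π as (P − I)ᵀπᵀ = 0, append the equation Σπⱼ = 1 as an extra row, and solve by least squares.

```python
import numpy as np

# One-step transition matrix from Example 1.
P = np.array([[0.28, 0.72],
              [0.08, 0.92]])
n = P.shape[0]

# Stack (P - I)^T (encoding pi P = pi) with a row of ones
# (encoding pi_1 + pi_2 = 1), then solve the overdetermined system.
A = np.vstack([(P - np.eye(n)).T, np.ones(n)])
b = np.concatenate([np.zeros(n), [1.0]])
pi, *_ = np.linalg.lstsq(A, b, rcond=None)

print(pi)  # [0.1 0.9]
```

One can also check the limit directly: `np.linalg.matrix_power(P, 50)` already has both rows very close to (0.1, 0.9).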
Example 2. Consider a model for road surface deterioration: the state of
the road surface is either 1 = “good”, 2 = “moderate” or 3 = “needs
replacing”. Suppose that the state of the road after n years is described
by a Markov chain model with
$$P = \begin{pmatrix} 0.9 & 0.1 & 0 \\ 0 & 0.8 & 0.2 \\ 0 & 0 & 1 \end{pmatrix}.$$
If the road starts out good in year 0, find the probability that it needs
replacement after 5, 10, 15 years.
[State transition diagram omitted: 1 → 1 (0.9), 1 → 2 (0.1); 2 → 2 (0.8), 2 → 3 (0.2); 3 → 3 (1).]
Calculation by computer gives
$$P^5 = \begin{pmatrix} 0.59049 & 0.26281 & 0.14670 \\ 0 & 0.32768 & 0.67232 \\ 0 & 0 & 1 \end{pmatrix},$$
$$P^{10} = \begin{pmatrix} 0.34868 & 0.24130 & 0.41002 \\ 0 & 0.10737 & 0.89263 \\ 0 & 0 & 1 \end{pmatrix}, \quad \text{and}$$
$$P^{15} = \begin{pmatrix} 0.20589 & 0.17071 & 0.62340 \\ 0 & 0.03518 & 0.96482 \\ 0 & 0 & 1 \end{pmatrix}.$$
Therefore, the probability that the road needs replacement after 5
years is given by $P^{(5)}_{13} = 0.14670$.
Similarly, the probability that the road needs replacement after 10
years is 0.41002, and after 15 years is 0.62340.
One can show that
$$\lim_{m\to\infty} P^m = \begin{pmatrix} 0 & 0 & 1 \\ 0 & 0 & 1 \\ 0 & 0 & 1 \end{pmatrix},$$
so the road will eventually need replacement.
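Since state 3 (“needs replacing”) is absorbing, every row of P^m converges to (0, 0, 1). A large matrix power illustrates the limit numerically:

```python
import numpy as np

# Transition matrix for the road-deterioration chain; state 3 is absorbing.
P = np.array([[0.9, 0.1, 0.0],
              [0.0, 0.8, 0.2],
              [0.0, 0.0, 1.0]])

# For large m, the transient entries (powers of 0.9 and 0.8) vanish,
# so every row of P^m approaches (0, 0, 1).
P_big = np.linalg.matrix_power(P, 500)
print(np.round(P_big, 6))
```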