Examination Paper for STAT0007 - Level 5
STAT0007: INTRODUCTION TO APPLIED
PROBABILITY
2019
Answer ALL questions.
Section A carries 40% of the total marks and Section B carries 60%. The relative weights
attached to each question are A1 (10 marks), A2 (10 marks), A3 (10 marks), A4 (10 marks),
B1 (30 marks), B2 (30 marks).
The numbers in square brackets indicate the relative weights attached to each part question.
You may find the following useful.
Distribution      Probability Generating Function
Poisson(λ)        G(s) = exp(λ(s − 1))
Geometric(p)      G(s) = ps / (1 − (1 − p)s), for |s| < (1 − p)^(-1)
Binomial(n, p)    G(s) = (1 − p + ps)^n
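The generating functions above can be checked empirically. The following is a minimal sketch (not part of the original paper), assuming NumPy is available, that estimates E[s^X] for X ~ Poisson(λ) by simulation and compares it with the tabulated closed form; the values of λ and s are arbitrary examples.

```python
# Illustrative check of the Poisson PGF G(s) = exp(lambda*(s - 1)); lam and s
# are arbitrary example values, not taken from the paper.
import numpy as np

rng = np.random.default_rng(0)
lam, s = 2.0, 0.7
x = rng.poisson(lam, size=200_000)
print(np.mean(s ** x))        # Monte Carlo estimate of E[s^X]
print(np.exp(lam * (s - 1)))  # tabulated closed-form value
```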
Section A
A1 Let {X_n; n = 0, 1, 2, ...} be a Markov chain with state space S = {1, 2, 3, 4, 5} and
transition matrix

P =
    [ 0.4   0.6   0     0     0   ]
    [ 0     0.2   0.4   0.4   0   ]
    [ 0     0     0     1     0   ]
    [ 0     0.5   0.5   0     0   ]
    [ 0     0     1     0     0   ] .
(a) Find the irreducible classes of intercommunicating states and classify them in
terms of positive or null recurrence, transience, periodicity and ergodicity. [3]
(b) State, with a reason, whether this Markov chain has:
(i) An invariant distribution. [1]
(ii) An equilibrium distribution. [1]
(c) Calculate P(X_{n+1} = 2 | X_n = 4, X_{n−1} = 3), naming any property you use. [1]
(d) Calculate P(X_{n+1} = 2, X_n = 4 | X_{n−1} = 3), naming any property you use. [2]
(e) Is it true that P(X_{n+1} = k | X_n = j, X_{n−1} = i) = P(X_{n+1} = k, X_n = j | X_{n−1} = i)
for all values of i, j and k for this Markov chain? Justify your answer. [2]
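As an optional companion to A1 (not part of the exam), the communicating classes can be found numerically as the strongly connected components of the directed graph with an edge i → j whenever p_ij > 0. A sketch, assuming SciPy is available:

```python
# Communicating classes of the A1 chain via strongly connected components.
import numpy as np
from scipy.sparse import csr_matrix
from scipy.sparse.csgraph import connected_components

P = np.array([[0.4, 0.6, 0.0, 0.0, 0.0],
              [0.0, 0.2, 0.4, 0.4, 0.0],
              [0.0, 0.0, 0.0, 1.0, 0.0],
              [0.0, 0.5, 0.5, 0.0, 0.0],
              [0.0, 0.0, 1.0, 0.0, 0.0]])

n_classes, labels = connected_components(csr_matrix(P > 0),
                                          directed=True, connection='strong')
print(n_classes, labels)  # class label for each of states 1..5 (0-indexed here)
```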
A2 A continuous time Markov chain with states {1, 2, 3, 4} has generator matrix

Q =
    [ −2     2     0     0   ]
    [  0    −4     4     0   ]
    [  0     1    −1     0   ]
    [  0     0    1.5  −1.5  ] .
(a) Write down the Chapman-Kolmogorov equation for p23(t + s), for any t, s > 0, in
its most simplified form. [2]
(b) Write down Kolmogorov’s forward equation for p23(t) for any t ≥ 0, omitting any
zero terms from the equation. [3]
The solution to Kolmogorov’s forward equation you gave in part (b) is

p23(t) = 4/5 − (4/5) exp(−5t) .
(c) Hence evaluate:
(i) p23(1); [1]
(ii) p22(1); [1]
(iii) p33(1) (hint: Use part (a)). [3]
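A hedged numerical check for A2 (not a substitute for the requested working): for a finite-state chain the transition function is P(t) = exp(Qt), so the quantities in part (c) can be read off a matrix exponential, assuming SciPy is available and states are ordered 1 to 4 as in Q.

```python
# Numerical check: P(t) = expm(Q t) for the A2 generator, evaluated at t = 1.
import numpy as np
from scipy.linalg import expm

Q = np.array([[-2.0,  2.0,  0.0,  0.0],
              [ 0.0, -4.0,  4.0,  0.0],
              [ 0.0,  1.0, -1.0,  0.0],
              [ 0.0,  0.0,  1.5, -1.5]])

P1 = expm(Q * 1.0)
print(P1[1, 2], P1[1, 1], P1[2, 2])  # p23(1), p22(1), p33(1) (0-indexed states)
```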
A3 A participant in a game show chooses one of four boxes at random, each with
probability 1/4. Box 1 contains £5,000, Box 2 £3,000, Box 3 £1,000, and Box 4 nothing. If
she chooses Box 4, the participant leaves the game. Otherwise, she keeps the money in
the box chosen, the box is refilled with the same amount and she repeats the process,
selecting from among the four boxes at random. What is the expected value of the
amount the participant will win by the time she leaves the game? [10]
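A Monte Carlo sketch for A3 (illustrative only, not a substitute for the requested expectation argument) that estimates the expected winnings by simulating the game many times:

```python
# Simulate the box game repeatedly and average the total winnings.
import numpy as np

rng = np.random.default_rng(1)
prizes = [5000, 3000, 1000, 0]  # Box 4 (prize 0) ends the game

def play_once():
    total = 0
    while True:
        box = rng.integers(4)   # choose one of the four boxes uniformly
        if box == 3:            # Box 4: leave the game
            return total
        total += prizes[box]

print(np.mean([play_once() for _ in range(100_000)]))
```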
A4 Let {N(t), t ≥ 0} be an irreducible and positive recurrent continuous time Markov
process with states S = {0, 1, 2, ...}. We are given the probability generating function:

G(s, t) = E[s^N(t) | N(0) = 0] = exp( (1/2)(s − 1)(1 − exp(−2t)) ) .
(a) Deduce the distribution of N(t) given that N(0) = 0. [2]
(b) Write down the probability P (N(t) = 0|N(0) = 0). [2]
(c) Explain why the process {N(t), t ≥ 0} has an equilibrium distribution. [3]
(d) What is the long-run proportion of time that the Markov chain will spend in
states {0, 1}? [3]
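An optional symbolic companion to A4: the probabilities P(N(t) = k | N(0) = 0) are the coefficients of s^k in G(s, t), so they can be recovered by expanding the generating function in powers of s; a sketch assuming SymPy is available.

```python
# Expand G(s, t) in powers of s; the coefficient of s^k is P(N(t) = k | N(0) = 0).
import sympy as sp

s, t = sp.symbols('s t', positive=True)
G = sp.exp(sp.Rational(1, 2) * (s - 1) * (1 - sp.exp(-2 * t)))
expansion = sp.series(G, s, 0, 3).removeO()  # terms up to s^2
print(sp.simplify(expansion))
```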
Section B
B1 In a certain town at time t = 0 there are no bears. Brown bears begin to arrive
according to a Poisson process of rate β per hour. Grey bears arrive according to a
Poisson process of rate γ per hour, independently of the brown bears.
(a) Name the distribution of the total number of bears in the town at time t > 0
hours, given there are none at time 0. [2]
(b) Find the probability that no new bears will arrive in the town during the time
period t ∈ [1, 2] ∪ [3, 5] hours. [4]
(c) Derive an expression for the probability that the first bear to arrive is brown,
showing your workings and/or reasoning. [3]
(d) If n bears arrive during the time period [5, 10] hours, what is the distribution of
the number of bears arriving during the period [7, 9] hours? Explain your answer.
[3]
(e) We are now given the extra information that each arriving female bear carries her
offspring with her. Is it still appropriate to use a Poisson process to model the
arrival of bears? Explain your answer. [3]
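As an illustrative companion to parts (a)-(e) (not part of the exam), the two independent arrival streams can be simulated from their exponential inter-arrival times and then superposed; the rates β, γ and the time horizon below are assumed example values.

```python
# Simulate brown and grey bear arrivals as independent Poisson processes.
import numpy as np

rng = np.random.default_rng(2)
beta, gamma, horizon = 1.5, 0.8, 10.0  # assumed example rates (per hour) and horizon

def arrival_times(rate):
    times, t = [], 0.0
    while True:
        t += rng.exponential(1.0 / rate)  # Exp(rate) inter-arrival time
        if t > horizon:
            return np.array(times)
        times.append(t)

brown, grey = arrival_times(beta), arrival_times(gamma)
all_bears = np.sort(np.concatenate([brown, grey]))  # superposed arrival stream
print(len(brown), len(grey), len(all_bears))
```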
An isolated species of mammals evolves on an island according to a linear birth-linear
death Markov chain. In particular, we assume that each mammal on the island at
time t has probability µh + o(h) of dying in the time interval (t, t + h], and probability
λh + o(h) of giving birth to a baby mammal during (t, t + h], with all involved events
being independent of each other. You may assume that µ, λ > 0.
Let {Mt; t ≥ 0} denote the number of mammals on the island at time t.
(f) Write down the state space S and the generator matrix Q of {Mt; t ≥ 0}. [3]
(g) Suppose that Mt = k, with k > 0. Find the probability that there will be at
least one death or birth in the time interval (t, t + δ], for δ > 0, showing your
reasoning. [3]
(h) If Mt = 1, state with a reason the expected further time until there are two
mammals on the island for the first time since time t. [2]
(i) Under what condition(s) does the process {Mt; t ≥ 0} have an equilibrium
distribution? Assuming that the conditions are met, state the equilibrium distribution.
[2]
(j) At time τ > 0, a comet falls on the island. As a result, the mammals can no
longer give birth and the death rate for each mammal increases from µ to (a + µ)
for some a > 0. If there were m > 0 living mammals at the time of impact τ,
derive an expression for the expected amount of time until the species becomes
extinct, explaining your answer. [5]
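For parts (f)-(j), the generator of the linear birth-death chain is infinite, but a truncated version can be written down for numerical experiments; the rates and the cutoff N_MAX below are assumed example values, not part of the question.

```python
# Truncated generator matrix for a linear birth-death chain on states 0..N_MAX.
import numpy as np

lam, mu, N_MAX = 0.4, 0.6, 50  # assumed example birth/death rates and truncation

Q = np.zeros((N_MAX + 1, N_MAX + 1))
for k in range(1, N_MAX):
    Q[k, k - 1] = k * mu           # total death rate in state k
    Q[k, k + 1] = k * lam          # total birth rate in state k
    Q[k, k] = -k * (lam + mu)
Q[N_MAX, N_MAX - 1] = N_MAX * mu   # boundary row from the truncation
Q[N_MAX, N_MAX] = -N_MAX * mu
print(Q[:4, :4])                   # state 0 is absorbing (row of zeros)
```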
B2 Let {X_n; n = 0, 1, 2, ...} be a Markov chain with state space S = {1, 2, 3, 4, 5} and
transition matrix

P =
    [  1     0     0     0     0   ]
    [ 1/4   1/2   1/4    0     0   ]
    [  0    1/2    0    1/2    0   ]
    [  0     0     0    1/6   5/6  ]
    [  0     0     0    1/2   1/2  ] .
(a) The chain is currently in state 2.
(i) State the distribution of the further length of time that the chain stays in
state 2. [1]
(ii) State the distribution of where the chain goes when it leaves state 2. [1]
(iii) Explain which of p22^(8) and p23^(2) · p32^(6) is at least as big as the other (there is no
need to compute either quantity). [3]
(b) Compute the expected number of visits that the chain makes to state 3, given
that X_0 = 2. [4]
(c) Compute the probability that the chain ever reaches state 4, given that X_0 = 2. [4]
(d) Does this Markov chain have an invariant and/or equilibrium distribution? Why? [2]
(e) Now consider a modified process, Z_n, where Z_n = |X_n − 3| and let Z_0 = 0. Show
that Z_n is not a Markov chain. [10]
(f) State whether each of the following is true, false, or cannot say. Marks will only
be awarded if you provide a suitable reason (which may include an example or
counter-example) for your answer. [5]
(i) For a discrete time Markov chain with state space S = {1, 2, 3, 4, 5, 6} and
equilibrium distribution (1/6, 0, 1/2, 1/3, 0, 0), state 2 may be positive recurrent.
(ii) A discrete time Markov chain with infinite state space will have at least one
null recurrent state.
(iii) It is possible for a discrete time Markov chain to have only transient states.
(iv) All discrete time Markov chains with exactly one closed class have an
equilibrium distribution.
(v) An irreducible aperiodic discrete time Markov chain, for which an invariant
distribution exists, must be positive recurrent.
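An optional numerical check for B2(b) and B2(c) (illustrative only): with the transient states {2, 3} collected into a set T, the fundamental matrix N = (I − P_T)^(−1) gives expected visit counts, and the probability of ever reaching state 4 solves a small linear system.

```python
# Fundamental-matrix check for B2(b) and hitting probability of state 4 for B2(c).
import numpy as np

P = np.array([[1,   0,   0,   0,   0  ],
              [1/4, 1/2, 1/4, 0,   0  ],
              [0,   1/2, 0,   1/2, 0  ],
              [0,   0,   0,   1/6, 5/6],
              [0,   0,   0,   1/2, 1/2]])

T = [1, 2]                                # states 2 and 3 (0-indexed)
N = np.linalg.inv(np.eye(2) - P[np.ix_(T, T)])
print(N[0, 1])                            # expected visits to state 3 starting from 2

# h solves (I - P_T) h = P[T, state-4 column]; h[0] is the probability from state 2.
h = np.linalg.solve(np.eye(2) - P[np.ix_(T, T)], P[np.ix_(T, [3])])
print(h[0, 0])
```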
End of Paper