MAS8384 Bayesian Methodology
Project 2021/22
Weighting: 60%
Issued: Monday, 21st February 2022, 4pm
Deadline: Monday, 28th February 2022, 4pm
Note 1: Completed work should be submitted via the Canvas assignment submission webpage for this module
(.../41297/assignments). Make sure you submit it in advance of the stated deadline. Details of the late
work policy can be found in the relevant module sub-page: .../41297/pages/assessment-information
Note 2: Alongside your project you should also submit a short video, approximately 3 minutes long, in which
you discuss one part of your project (following the guidance given in class). The video will be marked
pass/fail and will not otherwise count towards your final mark for the project.
Guidance on tackling the questions and writing up your results: I strongly advise you to start this
assignment well ahead of the deadline. You can discuss this assignment with others (I encourage this);
however, the work you submit must be your own, it must be written in your own words, and any computer
code which is included must be your own original work. It is good practice to acknowledge those with
whom you discussed the work. You do not need to write more than a few sides of text with some accompanying
figures and code snippets to score very highly. Clear and concise answers are positively encouraged, and
stylistically this should be close to the solution sheets for the bi-weekly exercises. If you write computer
code in order to answer these questions then include that code as part of your answer. Clearly annotate /
document your code – it’s your responsibility to convey what you are doing.
For this assignment s1, . . . , s8 respectively refer to the characters of your eight-character Login ID.
For instance, if your Login ID is b8273645 then s2 = 8, s3 = 2, and s7 = 4.
1. Slice Sampling is a data augmentation technique which samples from a target density f by adding
the additional variable U to the state of interest. It proceeds as follows: starting with an initial
value X(0) ∈ supp(f), iterate for t = 1, 2, . . .
1. Draw U(t) ∼ U[0, f(X(t−1))].
2. Draw X(t) ∼ U{x : f(x) ≥ U(t)},
where U[a, b] denotes the uniform distribution on the interval [a, b].
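As a concrete illustration of the two steps above, the sketch below runs a slice sampler for a standard Laplace target, chosen purely because its slices {x : f(x) ≥ u} are intervals with closed-form endpoints (it is not the density of part (c), and the function names are illustrative, not part of the assignment):

```python
import random, math

def laplace_pdf(x):
    # Standard Laplace density, used purely as an illustrative target.
    return 0.5 * math.exp(-abs(x))

def slice_sampler(n_iter, x0=0.0, seed=1):
    """Slice sampler for a target whose slices {x : f(x) >= u} have
    closed-form endpoints (here the standard Laplace)."""
    rng = random.Random(seed)
    x = x0
    samples = []
    for _ in range(n_iter):
        # Step 1: draw the auxiliary variable U ~ Uniform(0, f(x)).
        u = rng.uniform(0.0, laplace_pdf(x))
        # Step 2: draw X uniformly on the slice {x : f(x) >= u};
        # for the Laplace target this is the interval [log(2u), -log(2u)].
        half_width = -math.log(2.0 * u)
        x = rng.uniform(-half_width, half_width)
        samples.append(x)
    return samples
```

For targets without closed-form slices, step 2 typically requires a stepping-out or shrinkage procedure instead.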
(a) Show that the slice sampler is a Gibbs sampler that samples from the uniform distribution on
the set {(x, u) : u ≤ f(x)}. [4 marks]
(b) Deduce from this that the invariant distribution of X(t) has density f . [4 marks]
(c) For the density

f(x) = (3/8) [ I[−5,−3](x)(1 − (x + 4)^2) + I[+3,+5](x)(1 − (x − 4)^2) ],

where I[a,b](x) = 1 if x ∈ [a, b], and 0 otherwise.
i. What would the two steps of a slice sampler iteration be? State each step explicitly for this
particular distribution. [4 marks]
ii. Implement both a slice sampler and a random walk Metropolis algorithm to sample from f .
Estimate the mean and variance of f using a few thousand iterations of each. You might
wish to consider several proposal scales for the Metropolis algorithm.
[16 marks]
iii. Using simple plots, compare qualitatively the behaviour of the two algorithms. [4 marks]
(d) Slice sampling can be difficult to implement for many real problems. Give one reason why.
Suggest another MCMC algorithm with the same target distribution as the slice sampler which
would not suffer from this problem. [3 marks]
Q1 Total: [35 marks]
2. Consider the following univariate density:

f(x) ∝ exp{ −[ |x|^(3/2) − (s8 + 10) · |x|^(3/4) ] / (10s7 + s8 + 5) }.
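A minimal sketch of evaluating this unnormalised density, assuming the displayed form and using placeholder digits s7 = 4, s8 = 5 (substitute your own Login ID digits); it is written in Python for brevity here, though part (a) below asks for an R function:

```python
import math

# Placeholder Login ID digits -- substitute your own s7 and s8.
S7, S8 = 4, 5

def log_f_unnorm(x, s7=S7, s8=S8):
    """Log of the unnormalised density; working on the log scale
    avoids overflow/underflow for large |x|."""
    a = abs(x)
    return -(a ** 1.5 - (s8 + 10.0) * a ** 0.75) / (10.0 * s7 + s8 + 5.0)

def f_unnorm(x, s7=S7, s8=S8):
    # Unnormalised density itself, symmetric in x.
    return math.exp(log_f_unnorm(x, s7, s8))
```

Evaluating this on a grid and plotting it (part (a)) is a useful correctness check for everything that follows.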
(a) Write an R function to evaluate this density (up to its normalising constant), and produce an
appropriate plot to visualise it.
[6 marks]
(b) Implement a simulated annealing algorithm to provide an estimate of the global mode of f .
[8 marks]
(c) Implement the following algorithms to sample from f :
i. A Random-Walk Metropolis algorithm (with Gaussian innovation).
ii. A Metropolised Slice Sampler (with Gaussian innovation).
iii. Another Monte Carlo algorithm of your choosing. Provide an indication of why your choice
may be interesting to investigate (it may of course transpire in practice it is not). If the
algorithm wasn’t explicitly covered in this course, then provide a short description of the
methodology and an appropriate reference.
(Hint: In addition to implementing algorithms and providing documented code, I am expecting
evidence that they are implemented correctly by comparison to (a) and other appropriate checks)
[16 marks]
(d) Starting the chain at X0 = 0, use a few short runs to optimise the proposal scale
of the algorithms you implemented in (c)i and (c)ii (and (c)iii if necessary). (Hint: Make sure
you provide a rationale for your choice)
[8 marks]
(e) For both the scale you found in the previous part, and for σ = 1, compare the performance
of the three algorithms implemented in (c) for two different starting values:
• X0 = 0;
• X0 = 60s6 + 20s7 + 10s8 + 9.
Use them to estimate the expectation of |X|, and the probability that X > 10s7 + s8 + 5,
when X is a random variable distributed according to f. Explain how you decide on the number
of iterations to use, the length of any burn-in period, and how you make any other choices that
you need to make.
[8 marks]
(f) Discuss what you observe and explain why the algorithms behave in this way. (Hint: I’m looking
for a few well chosen observations and comparisons about both the qualitative and quantitative
behaviour of the algorithms you have implemented)
[4 marks]
Q2 Total: [50 marks]
3. A helpful PhD student suggests that you modify the algorithms you implemented in Q2. She suggests
that you replace the Metropolis-Hastings acceptance probability (αM-H) where it appears in your
implementations with a variant she has been considering in her doctoral work, αPhD. She outlines to
you that on the tth iteration of your MCMC algorithm, instead of accepting a proposal X (drawn
from an appropriate instrumental distribution, q) with probability αM-H(X|Xt−1), you should
accept the proposal with probability

αPhD(X|Xt−1) := f(X) · q(Xt−1|X) / [ f(X) · q(Xt−1|X) + f(Xt−1) · q(X|Xt−1) ] ∈ [0, 1].
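Without prejudging parts (a) and (b) below, the suggested rule can be dropped into a random-walk chain as follows; this sketch (function names and the standard normal target are mine, for illustration only) simply shows where αPhD replaces the usual acceptance step:

```python
import random, math

def alpha_phd(f_new, f_old, q_old_given_new=1.0, q_new_given_old=1.0):
    # The suggested acceptance probability; with a symmetric proposal
    # (e.g. a Gaussian random walk) the q factors cancel.
    num = f_new * q_old_given_new
    return num / (num + f_old * q_new_given_old)

def rw_chain(f, n_iter, x0=0.0, scale=1.0, seed=1):
    """Random-walk chain (Gaussian innovation, hence symmetric q)
    with alpha_phd in place of the Metropolis-Hastings ratio."""
    rng = random.Random(seed)
    x = x0
    out = []
    for _ in range(n_iter):
        prop = x + rng.gauss(0.0, scale)
        if rng.random() < alpha_phd(f(prop), f(x)):
            x = prop
        out.append(x)
    return out
```

Comparing output from such a chain with a standard Metropolis-Hastings run on the same target is one way to gather evidence for your answer to part (a).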
(a) Is this modification valid? (i.e. is the invariant distribution of this Markov chain f?).
[10 marks]
(b) Does this modification offer any advantage over the use of the Metropolis-Hastings acceptance
probability, αM-H? Clearly state whether or not (and why) you would recommend others im-
plement her suggested modification.
[5 marks]
Q3 Total: [15 marks]
Total: [100 marks]