6CCM314A
King's College London, University of London

This paper is part of an examination of the College counting towards the award of a degree. Examinations are governed by the College Regulations under the authority of the Academic Board.

FOLLOW the instructions you have been given on how to upload your solutions.

BSc and MSci Examination
6CCM314A Mathematical Aspects of Statistical Mechanics
Summer 2020
Time Allowed: Two Hours

This paper consists of two sections, Section A and Section B. Section A contributes half the total marks for the paper. Answer all questions in Section A. All questions in Section B carry equal marks, but if more than TWO questions are attempted, then only the best two will count. You may consult lecture notes.

© 2020 King's College London

SECTION A

A1. Consider a random variable X taking values x ∈ A_X in the alphabet A_X. Denote the probability of each outcome by P(x) and the size of the alphabet by |A_X|. You can assume the validity of Gibbs' inequality without proof.

(a) [2 points] State the definition of the entropy H(X).

(b) [2 points] Let P(x) = 1/|A_X| for all x ∈ A_X. Without calculations, state the value of H(X).

Let X be a geometric random variable defined on the alphabet A_X = {0, 1, 2, 3, ...}, where the probability of each outcome is

    P(x) = P(X = x) = t(1 − t)^x ,  with 0 < t ≤ 1.

The entropy of X is

    H(X) = −log_2 t − [(1 − t)/t] log_2(1 − t) .

(c) [4 points] Show that the average of X is 〈X〉_P = (1 − t)/t.

(d) [12 points] Consider another random variable Y, defined on the same alphabet as X and whose probability of each outcome is denoted by Q(y). Assume that 〈X〉_P = 〈Y〉_Q. Show that H(Y) ≤ H(X). Hint: compute the KL divergence between Q and P.

- 2 - See Next Page

A2. Consider a joint ensemble of random variables (X, Y), with X taking values x ∈ A_X in the alphabet A_X and similarly y ∈ A_Y. Denote the probability of each joint outcome (x, y) by P_{X,Y}(x, y) and the marginal probabilities by P_X(x) and P_Y(y).
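(The mean and entropy formulas quoted in A1 above can be sanity-checked numerically. A minimal sketch; the truncation of the infinite alphabet at a large cutoff is an assumption made only for this check.)

```python
import math

# Numerical check of the formulas quoted in A1 for the geometric
# distribution P(x) = t*(1-t)**x on {0, 1, 2, ...}, truncated at a
# large cutoff (an assumption of this sketch; the tail is negligible).
t = 0.3
xs = range(2000)
P = [t * (1 - t) ** x for x in xs]

mean = sum(x * p for x, p in zip(xs, P))
entropy = -sum(p * math.log2(p) for p in P if p > 0)

# Closed forms stated in the question:
mean_closed = (1 - t) / t
entropy_closed = -math.log2(t) - ((1 - t) / t) * math.log2(1 - t)

assert abs(mean - mean_closed) < 1e-9
assert abs(entropy - entropy_closed) < 1e-9
```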
Denote by H(X) and H(Y) the individual entropies of X and Y, respectively, and by H(X, Y) the joint entropy of X and Y.

(a) [2 points] Prove that H(X, Y) = H(X) + H(Y) if X and Y are statistically independent.

(b) [8 points] State Jensen's inequality for a convex function f(x). Furthermore, prove it for the simplest case of an alphabet of two outcomes, A_X = {a_1, a_2}, with P(X = a_1) = p_1 and P(X = a_2) = p_2.

(c) [2 points] State the two chain rules for the entropies, expressing H(X, Y) in terms of the conditional entropies H(X|Y) and H(Y|X), respectively.

(d) [8 points] Let Y be a function of the discrete random variable X, i.e. Y = g(X), for a Borel-measurable real function g. Show that H(Y) ≤ H(X). From the proof, deduce when the equality holds. Hint: compute H(X, Y) in two different ways, using the two chain rules given in (c).

A3. The penalty shootout method is used to determine a winner in football matches that would have otherwise been drawn or tied. Let X be the number of goals scored by a team during a shootout session in an international tournament. This is a random variable taking values in the alphabet A_X = {0, 1, 2, 3, 4, 5}, and let p_k = P(X = k) be the probability that X takes the value k. Based on extensive empirical data, the bookmakers are confident that they can reliably estimate the following two quantities

    G_1 = p_0 + p_5 ,  (1)
    G_2 = 〈X〉 .  (2)

Let S be the entropy of X in natural log units.

(a) [7 points] By finding the minimum of a suitably defined Lagrangian L, determine the probability distribution {p_0, ..., p_5} (depending on Lagrange multipliers λ_1, λ_2) that maximizes S subject to the constraints in Equations (1) and (2).

(b) [2 points] Express the free energy F in terms of the Lagrange multipliers λ_1, λ_2.

(c) [7 points] Compute S[{p_k}], the entropy in natural log units of the maxent distribution {p_0, ..., p_5}, as a function of λ_1, λ_2.

(d) [4 points] Assume now λ_2 = 0.
Discuss what happens to the maxent distribution {p_0, ..., p_5} in the cases λ_1 → −∞, λ_1 → +∞ and λ_1 → 0, relating your answer to the values taken by S[{p_k}] in each of these limits.

SECTION B

B4. Consider a system of N voters, where each voter can cast a vote for the positive or negative party, or abstain (σ_i ∈ {−1, 1, 0}). Denote the voting pattern by σ = (σ_1, ..., σ_N) and the magnetisation by M(σ) = ∑_{i=1}^N σ_i. The maxent distribution of voting patterns, given a constraint on the average 〈M(σ)〉 alone, is P(σ) = Z^{−1} exp(hM(σ)), where h is a Lagrange multiplier and Z is the partition function.

(a) [3 points] Compute the partition function Z as a function of h and N.

(b) [1 point] Compute the free energy F.

(c) [2 points] Compute the average magnetisation 〈M(σ)〉 as a function of h and N.

(d) [2 points] Compute the entropy S[P] in natural log units of P(σ).

Let M_0(σ) = ∑_{i=1}^N (1 − σ_i^2) represent the number of abstentions in the pattern σ. You may use the binomial expansion formula (a + b)^n = ∑_{k=0}^n C(n, k) a^k b^{n−k} and the symmetry of binomial coefficients C(n, k) = C(n, n−k), for 0 ≤ k ≤ n.

(e) [17 points] Using the integral representation of the Kronecker delta

    δ_{a,b} = ∫_0^{2π} (dξ/2π) e^{iξ(a−b)} ,  for a, b integers,

compute the probability distribution of the number of abstentions

    P(M_0) = Prob[M_0(σ) = M_0] = ∑_σ P(σ) δ_{M_0, M_0(σ)} ,

and check that it is correctly normalised on the alphabet M_0 ∈ {0, 1, ..., N}.

(f) [5 points] Compute the limits of P(M_0) for h → +∞ and h → −∞ and provide an intuitive explanation of the results.

B5. Let I_N = ∫_a^b dx w(x) e^{−Ng(x)} be a convergent integral, dependent on a real parameter N.

(a) [8 points] State under which assumptions on g Laplace's approximation for I_N as N → ∞ holds, and prove it. You may use the Gaussian integral formula ∫_{−∞}^{+∞} dx e^{−α(x−x_0)^2} = √(π/α) for α > 0 without proof.

Consider now a system of N two-party voters σ_i ∈ {−1, 1}.
Denote the voting pattern by σ = (σ_1, ..., σ_N) and the voter alignment by E(σ) = (1/N) ∑_{i<j} σ_i σ_j.
The maxent distribution of voting patterns, given a constraint on the average 〈E(σ)〉 alone, is P(σ) = Z^{−1} exp(JE(σ)), where J is a Lagrange multiplier and Z is the partition function. You may use that E(σ) = M^2(σ)/(2N) − 1/2, where M(σ) = ∑_{i=1}^N σ_i is the magnetisation.
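The quoted identity between E(σ) and M(σ) can be verified exhaustively for a small system. A minimal sketch, taking E(σ) = (1/N) ∑_{i<j} σ_i σ_j as the pairwise alignment consistent with this identity (an assumption of this check):

```python
from itertools import product

# Brute-force check, for small N, of the identity quoted in B5:
# E(sigma) = M(sigma)**2 / (2N) - 1/2,
# with E(sigma) = (1/N) * sum over pairs i<j of sigma_i * sigma_j.
N = 6
for sigma in product([-1, 1], repeat=N):
    M = sum(sigma)
    E = sum(sigma[i] * sigma[j]
            for i in range(N) for j in range(i + 1, N)) / N
    assert abs(E - (M * M / (2 * N) - 0.5)) < 1e-12
```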
(b) [9 points] Use the identity

    ∫_{−∞}^{+∞} (dx/√(2πτ^2)) e^{−x^2/(2τ^2) + ax} = e^{a^2 τ^2 / 2}

to show that the partition function can be written as

    Z = [e^{−J/2}/√(2πJ/N)] ∫_{−∞}^{+∞} dx e^{−Ng(x,J)} ,

where g(x, J) = x^2/(2J) − ln[2 cosh(x)].
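The integral representation of Z in part (b) can be checked numerically for small N, comparing the exact sum over voting patterns with a Riemann-sum evaluation of the Gaussian form. A sketch; the step size and integration cutoff below are assumptions of this check:

```python
import math
from itertools import product

# Check of the representation quoted in B5(b):
# Z = sum_sigma exp(J*E(sigma)), with E = M^2/(2N) - 1/2, versus
# Z = e^{-J/2}/sqrt(2*pi*J/N) * integral of e^{-N g(x,J)} dx.
N, J = 5, 0.8

Z_exact = sum(
    math.exp(J * (sum(s) ** 2 / (2 * N) - 0.5))
    for s in product([-1, 1], repeat=N)
)

def g(x):
    return x * x / (2 * J) - math.log(2 * math.cosh(x))

dx, L = 1e-3, 20.0  # Riemann-sum step and cutoff (assumed here)
integral = sum(math.exp(-N * g(-L + k * dx)) * dx
               for k in range(int(2 * L / dx)))
Z_integral = math.exp(-J / 2) / math.sqrt(2 * math.pi * J / N) * integral

assert abs(Z_exact - Z_integral) / Z_exact < 1e-4
```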
(c) [9 points] Applying Laplace's approximation to the integral over x for large N, compute the free energy per voter f(J) = lim_{N→∞} F/N, where F = −ln Z.

(d) [4 points] The second derivative ∂²g/∂x² can be written as ∂²g/∂x² = 1/J − φ(x), where φ(x) does not depend on J. Compute φ(x) and sketch its graph as a function of x. By considering the sign of ∂²g/∂x², find the value of J above which x = 0 is no longer the argmin of g, signaling the onset of collective behaviour in this system.
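As an illustration of the method used throughout B5 (not part of the question), Laplace's approximation I_N ≈ w(x_0) e^{−Ng(x_0)} √(2π/(N g''(x_0))) can be compared against direct quadrature on a toy integral; the choices of w, g and the quadrature parameters here are assumptions of this sketch:

```python
import math

# Toy example of Laplace's approximation: g(x) = (x-1)^2 has a unique
# interior minimum at x0 = 1 with g''(x0) = 2; w(x) = 2 + sin(x).
def w(x):
    return 2 + math.sin(x)

def g(x):
    return (x - 1) ** 2

N = 500
dx = 1e-4
riemann = sum(w(-5 + k * dx) * math.exp(-N * g(-5 + k * dx)) * dx
              for k in range(int(10 / dx)))
laplace = w(1.0) * math.exp(-N * g(1.0)) * math.sqrt(2 * math.pi / (N * 2.0))

# The two agree up to O(1/N) corrections.
assert abs(riemann - laplace) / riemann < 1e-2
```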
B6. Consider a one-dimensional Ising chain with N spins taking values σ_i ∈ {−1, 1}.
The probability of a configuration σ = (σ1, . . . , σN) is given by the Boltzmann
weight
P(σ) = exp(−βH(σ))/Z ,
where β is the inverse temperature and Z is the partition function. The total
energy H(σ) is given by
    H(σ) = −J ∑_{i=1}^N σ_i σ_{i+1} − h ∑_{i=1}^N σ_i ,
where h ∈ R is the external field and J > 0 is the coupling constant. We assume
periodic boundary conditions, so σ_{N+1} ≡ σ_1.
(a) [15 points] Compute the partition function Z as a function of β, h, J, N, using a transfer matrix method.
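The transfer-matrix method named in part (a) can be sanity-checked numerically against a brute-force sum over configurations for small N. A sketch; the symmetric 2×2 matrix elements below are the standard textbook choice, assumed here rather than quoted from the paper:

```python
import math
from itertools import product

# Transfer-matrix check for the periodic Ising chain of B6:
# T[s, s'] = exp(beta*(J*s*s' + h*(s + s')/2)), Z = Tr(T^N).
beta, J, h, N = 0.5, 1.0, 0.3, 6

spins = [-1, 1]
T = [[math.exp(beta * (J * s * t + h * (s + t) / 2)) for t in spins]
     for s in spins]

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(2))
             for j in range(2)] for i in range(2)]

TN = T
for _ in range(N - 1):
    TN = matmul(TN, T)
Z_transfer = TN[0][0] + TN[1][1]

# Brute force: Z = sum_sigma exp(-beta*H(sigma)), sigma_{N+1} = sigma_1.
Z_brute = 0.0
for s in product(spins, repeat=N):
    H = -J * sum(s[i] * s[(i + 1) % N] for i in range(N)) - h * sum(s)
    Z_brute += math.exp(-beta * H)

assert abs(Z_transfer - Z_brute) / Z_brute < 1e-10
```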
(b) [3 points] Compute the free energy per spin f = lim_{N→∞} F/N, where F = −(1/β) ln Z.
(c) [6 points] Show that the average total energy can be written as

    〈H(σ)〉 = ∂(βF)/∂β .
(d) [6 points] Prove the relation

    F = 〈H(σ)〉 − (1/β) S[P] ,
where S[P ] is the entropy in natural log units of the Boltzmann distribution
P (σ).
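Both relations, (c) and (d), can be verified by direct enumeration for a small chain; the parameter values in this sketch are arbitrary assumptions:

```python
import math
from itertools import product

# Check of B6(c): <H> = d(beta*F)/d(beta), and B6(d): F = <H> - S[P]/beta,
# by brute-force enumeration of a short periodic Ising chain.
J, h, N = 1.0, 0.3, 5

def logZ(beta):
    return math.log(sum(
        math.exp(beta * (J * sum(s[i] * s[(i + 1) % N] for i in range(N))
                         + h * sum(s)))
        for s in product([-1, 1], repeat=N)))

beta = 0.7
Z = math.exp(logZ(beta))
F = -logZ(beta) / beta

# Boltzmann average of H and the entropy S[P], by direct enumeration.
avg_H, S = 0.0, 0.0
for s in product([-1, 1], repeat=N):
    H = -J * sum(s[i] * s[(i + 1) % N] for i in range(N)) - h * sum(s)
    p = math.exp(-beta * H) / Z
    avg_H += p * H
    S += -p * math.log(p)

# (c): <H> = d(beta*F)/d(beta) = -d(ln Z)/d(beta), via central differences.
eps = 1e-6
dbetaF = (-logZ(beta + eps) + logZ(beta - eps)) / (2 * eps)
assert abs(avg_H - dbetaF) < 1e-5

# (d): F = <H> - S[P]/beta.
assert abs(F - (avg_H - S / beta)) < 1e-10
```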