THE AUSTRALIAN NATIONAL UNIVERSITY
Assignment 3
COMP2610/COMP6261
Information Theory, Semester 2 2023
Release Date: Tuesday 26 September 2023
Due Date: Monday 23 October 2023, 9:00 am
Cut-off Date: Friday 28 October 2023, 5:00 pm
No submission allowed after Friday 28 October 2023, 5:00 pm.
Assignment 3 weighting is 20% of the course mark.
Instructions:
Marks:
• The mark for each question is indicated next to the question. For questions where you are
asked to prove results, if you cannot prove an earlier part, you may still attempt subsequent
parts of the question assuming the truth of the earlier part.
• For all students: Answer all Questions. You will be marked out of 100.
Submission:
• Submit your assignment together with a cover page as a single PDF on Wattle.
• Submission deadlines will be strictly enforced. A late submission attracts a penalty of 5%
per working day. If you submit after the cut-off date, you get zero marks (100% penalty).
Extensions will be considered according to the ANU Policy.
• All assignments must be done individually. Plagiarism is a university offence and will be dealt
with according to university procedures http://academichonesty.anu.edu.au/UniPolicy.html.
Question 1: Entropy and Joint Entropy [10 marks total]
An ordinary deck of 52 cards containing 13 clubs, 13 diamonds, 13 hearts, and 13 spades is
shuffled and dealt out one card at a time without replacement. Let Xi be the suit of the ith card.
(a) Determine H(X1). [4 marks]
(b) Determine H(X1,X2, · · · ,X52). [6 marks]
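As a numerical sanity check (not a substitute for the written derivation), the entropy of a discrete distribution can be evaluated directly. The sketch below assumes the standard definition H(X) = −Σ p log2 p and applies it to the suit of the first card, which is uniform over the four suits:

```python
import math

def entropy(probs):
    """Shannon entropy in bits of a discrete distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# The first card is equally likely to be any of the four suits.
suits = [13 / 52] * 4
print(entropy(suits))  # 2.0 bits
```

Part (b) cannot be reduced to a single marginal like this, since the draws are without replacement and hence dependent; the chain rule for joint entropy is the intended tool there.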
Question 2: Source Coding [30 marks total]
Question 2-I [12 marks total]
Construct a binary Huffman code and a Shannon-Fano-Elias code for the following distribution on 5
symbols: p = (0.3, 0.3, 0.2, 0.1, 0.1). What is the average length of each of these codes?
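The assignment expects the Huffman tree to be constructed by hand, but the resulting codeword lengths can be cross-checked mechanically. One common implementation sketch uses a min-heap of subtrees; each merge adds one bit to every symbol in the two merged subtrees (the code itself is not unique, only the multiset of lengths is):

```python
import heapq
from itertools import count

def huffman_lengths(probs):
    """Codeword lengths of an optimal binary Huffman code.

    Heap entries are (probability, tiebreak, [symbol indices in subtree]);
    the tiebreak counter avoids comparing lists when probabilities tie.
    """
    tie = count()
    heap = [(p, next(tie), [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    while len(heap) > 1:
        p1, _, s1 = heapq.heappop(heap)
        p2, _, s2 = heapq.heappop(heap)
        for i in s1 + s2:  # every merge adds one bit to these symbols
            lengths[i] += 1
        heapq.heappush(heap, (p1 + p2, next(tie), s1 + s2))
    return lengths

p = [0.3, 0.3, 0.2, 0.1, 0.1]
L = huffman_lengths(p)
print(L, sum(pi * li for pi, li in zip(p, L)))  # average length 2.2
```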
Question 2-II [12 marks total]
Consider the random variable

X = (  x1    x2    x3    x4    x5    x6    x7
      0.49  0.26  0.12  0.04  0.04  0.03  0.02 )
(a) Find a binary Huffman code for X . [4 marks]
(b) Find H(X) and the expected codelength for this encoding. [2 marks]
(c) Find a ternary Huffman code for X . [4 marks]
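For part (b), the entropy H(X) can be checked numerically against a hand computation (the Huffman trees in parts (a) and (c) must still be built by hand):

```python
import math

p = [0.49, 0.26, 0.12, 0.04, 0.04, 0.03, 0.02]
H = -sum(pi * math.log2(pi) for pi in p)
print(round(H, 4))  # roughly 2.01 bits
```

The expected codelength of any optimal binary code must lie between H(X) and H(X) + 1, which gives a quick consistency check on the Huffman code found in part (a).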
Question 2-III [6 marks total]
A random variable X takes on three values a, b, and c, with probabilities 0.55, 0.25, and 0.2 respectively.
(a) What are the lengths of the binary Huffman codewords for X? What are the lengths of the binary
Shannon codewords for X? [3 marks]
(b) What is the smallest integer D such that the expected Shannon codeword length with a D-ary
alphabet equals the expected Huffman codeword length with a D-ary alphabet? [3 marks]
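A sketch for exploring both parts: the D-ary Shannon codeword length for a symbol of probability p is ⌈log_D(1/p)⌉, computed below as the smallest L with D^(−L) ≤ p to avoid floating-point ceiling artifacts. Comparing the resulting expected length against the hand-built Huffman lengths for each D answers part (b):

```python
def shannon_lengths(probs, D=2):
    """D-ary Shannon codeword lengths: smallest L with D**(-L) <= p,
    i.e. ceil(log_D(1/p)) for each symbol probability p."""
    lengths = []
    for p in probs:
        L = 1
        while D ** (-L) > p:
            L += 1
        lengths.append(L)
    return lengths

p = [0.55, 0.25, 0.2]
for D in (2, 3, 4, 5):
    L = shannon_lengths(p, D)
    print(D, L, sum(pi * li for pi, li in zip(p, L)))
```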
Question 3: Channel Capacity [30 marks total]
Question 3-I [15 marks total]
There is a discrete memoryless channel (DMC) with channel input X ∈ X = {1,2,3,4}. The
channel output Y is generated according to the probabilistic rule

Y = { X    with probability 1/2
    { 2X   with probability 1/2
Answer the following questions.
(a) Draw the schematic of the channel and clearly show possible channel outputs and the channel
transition probabilities. [5 marks]
(b) Write the mutual information I(X ;Y ) as a function of the most general input probability distri-
bution. [5 marks]
(c) Find a way of using only a subset of the channel inputs such that the maximum mutual informa-
tion (you need to quantify its value) can be achieved with zero error. [5 marks]
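For experimenting with parts (b) and (c), the mutual information of any candidate input distribution can be evaluated directly from the transition rule (this computes I(X;Y) for one distribution at a time; it does not by itself establish the capacity or the zero-error claim):

```python
import math
from collections import defaultdict

# Channel: Y = X with prob 1/2, Y = 2X with prob 1/2, for X in {1,2,3,4}.
def mutual_information(px):
    """I(X;Y) in bits for an input distribution px = {x: P(X=x)}."""
    pxy = defaultdict(float)
    for x, p in px.items():
        pxy[(x, x)] += 0.5 * p
        pxy[(x, 2 * x)] += 0.5 * p
    py = defaultdict(float)
    for (x, y), p in pxy.items():
        py[y] += p
    return sum(p * math.log2(p / (px[x] * py[y]))
               for (x, y), p in pxy.items() if p > 0)

# e.g. the uniform input over all four symbols:
print(mutual_information({1: 0.25, 2: 0.25, 3: 0.25, 4: 0.25}))
```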
Question 3-II [10 marks total]
The Z-channel has binary input and output alphabets and transition probabilities p(y|x) given by the
following matrix:
p(y|x) = [  1    0  ]
         [ 1/3  2/3 ],    x, y ∈ {0,1}
Find the capacity of the Z-channel and the maximizing input probability distribution.
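The closed-form answer should be derived analytically, but a crude numerical sweep over the input distribution gives a useful cross-check. Writing a = P(X = 1), only X = 1 can produce Y = 1 here, so I(X;Y) = H2(2a/3) − a·H2(2/3):

```python
import math

def h2(p):
    """Binary entropy in bits."""
    return 0.0 if p in (0.0, 1.0) else -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def zchannel_mi(a):
    """I(X;Y) when P(X=1) = a, with p(0|0)=1, p(0|1)=1/3, p(1|1)=2/3."""
    p_y1 = (2 / 3) * a  # only X=1 can produce Y=1
    return h2(p_y1) - a * h2(2 / 3)

# crude grid maximization over the input distribution
best = max((zchannel_mi(a / 10000), a / 10000) for a in range(10001))
print(best)  # (capacity estimate, maximizing P(X=1) estimate)
```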
Question 3-III [5 marks total]
Consider a binary symmetric channel with Yi = Xi ⊕ Zi, where ⊕ is mod-2 addition and Xi, Yi ∈
{0,1}. Suppose that Zi has constant marginal probabilities Pr(Zi = 0) = p, but that Z1, Z2, · · · , Zn
are not necessarily independent. Assume that Zi is independent of the input Xj for all i, j. Let
C = 1 − H(p, 1−p). Show that

max_{p(x1,x2,··· ,xn)} I(X1,X2, · · · ,Xn; Y1,Y2, · · · ,Yn) ≥ nC.
Question 4: Joint Typical Sequences [30 marks total]
Question 4-I [15 marks total]
Let (xn,yn,zn) be drawn according to the joint distribution p(x,y,z) in an independent and identically
distributed (i.i.d.) manner. We say that (xn,yn,zn) is jointly ε-typical if all the following conditions
are met
• |H˜(xn)−H(X)| ≤ ε
• |H˜(yn)−H(Y )| ≤ ε
• |H˜(zn)−H(Z)| ≤ ε
• |H˜(xn,yn)−H(X ,Y )| ≤ ε
• |H˜(xn,zn)−H(X ,Z)| ≤ ε
• |H˜(yn,zn)−H(Y,Z)| ≤ ε
• |H˜(xn,yn,zn)−H(X ,Y,Z)| ≤ ε
where H˜(xn) = −(1/n) log2 p(xn), and the other empirical entropies are defined analogously. Now
suppose that (x˜n, y˜n, z˜n) is drawn i.i.d. according to the marginals p(x), p(y), and p(z). Therefore
(x˜n, y˜n, z˜n) has the same marginals as p(xn, yn, zn), but the three sequences are mutually independent.
Find upper and lower bounds on the probability that (x˜n, y˜n, z˜n) is jointly typical, in terms of
H(X,Y,Z), H(X), H(Y), H(Z), ε, and n.
Question 4-II [15 marks total]
Let p= [0.4,0.33,0.27] be the distribution of a random variable X that takes symbols from {a,b,c},
respectively.
(a) Find the empirical entropy of the i.i.d. sequence
x= aabaabbcabaccab
[5 marks]
(Hint: the empirical entropy is H˜(xn) = −(1/n) log2 p(xn).)
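The hint's formula can be evaluated mechanically: for an i.i.d. model p(xn) factorizes, so the empirical entropy reduces to a count-weighted sum of log-probabilities. A sketch (the written answer should show the counts and the arithmetic):

```python
import math
from collections import Counter

p = {"a": 0.4, "b": 0.33, "c": 0.27}
x = "aabaabbcabaccab"

counts = Counter(x)  # symbol counts in the observed sequence
n = len(x)
# H~(x^n) = -(1/n) log2 p(x^n) = -(1/n) * sum_i log2 p(x_i)
h_emp = -sum(c * math.log2(p[s]) for s, c in counts.items()) / n
print(round(h_emp, 4))
```

Part (b) then amounts to comparing h_emp against the true entropy H(X) of the distribution p within the tolerance ε.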
(b) Determine whether it is an ε-typical sequence with ε = 0.05. [5 marks]
(c) Now assume the following joint probability distribution between X and Y that take symbols from
{a,b,c} and {d,e, f} respectively.
p(x,y) = [ 0.20  0.10  0.10 ]
         [ 0.08  0.10  0.15 ]
         [ 0.15  0.05  0.07 ]
where in each row, x is fixed. We observe two i.i.d. sequences
x= aabaabbcabaccab
y= deefdefddfddffd
Determine whether (x,y) are jointly ε-typical. [5 marks]
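Joint typicality requires checking the joint condition as well as the single-sequence ones. The sketch below computes only the joint empirical entropy H˜(xn, yn); the marginal checks follow the same pattern as in part (a):

```python
import math
from collections import Counter

# joint pmf from the question: rows x in (a,b,c), columns y in (d,e,f)
pxy = {
    ("a", "d"): 0.20, ("a", "e"): 0.10, ("a", "f"): 0.10,
    ("b", "d"): 0.08, ("b", "e"): 0.10, ("b", "f"): 0.15,
    ("c", "d"): 0.15, ("c", "e"): 0.05, ("c", "f"): 0.07,
}
x = "aabaabbcabaccab"
y = "deefdefddfddffd"

n = len(x)
pairs = Counter(zip(x, y))  # counts of each observed (x_i, y_i) pair
h_joint_emp = -sum(c * math.log2(pxy[pair]) for pair, c in pairs.items()) / n
print(round(h_joint_emp, 4))
```

The verdict then follows from comparing this value (and the marginal empirical entropies) against the corresponding true entropies within ε.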