1996 Paper 8 Question 11

Information Theory and Coding
Let X and Y represent random variables with associated probability distributions
p(x) and p(y), respectively. They are not independent. Their conditional
probability distributions are p(x|y) and p(y|x), and their joint probability
distribution is p(x, y).
(a) What is the marginal entropy H(X) of variable X, and what is the mutual
information of X with itself?
[4 marks]
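One standard way to write the quantities asked for here (assuming a discrete alphabet and base-2 logarithms) is
\[
H(X) = -\sum_{x} p(x) \log_2 p(x), \qquad I(X;X) = H(X) - H(X|X) = H(X),
\]
since observing X leaves no residual uncertainty about itself, so H(X|X) = 0.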
(b) In terms of the probability distributions, what are the conditional entropies
H(X|Y) and H(Y|X)?
[4 marks]
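Written against the joint distribution p(x, y), the usual forms of these conditional entropies are
\[
H(X|Y) = -\sum_{x,y} p(x,y) \log_2 p(x|y), \qquad
H(Y|X) = -\sum_{x,y} p(x,y) \log_2 p(y|x).
\]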
(c) What is the joint entropy H(X, Y), and what would it be if the random
variables X and Y were independent?
[4 marks]
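In the same notation, the joint entropy is
\[
H(X,Y) = -\sum_{x,y} p(x,y) \log_2 p(x,y),
\]
and if X and Y were independent then p(x, y) = p(x)p(y), so the sum separates and H(X, Y) = H(X) + H(Y).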
(d) Give an alternative expression for H(Y) − H(Y|X) in terms of the joint entropy
and both marginal entropies.
[4 marks]
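Since H(Y|X) = H(X, Y) − H(X), one equivalent expression is
\[
H(Y) - H(Y|X) = H(X) + H(Y) - H(X,Y).
\]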
(e) What is the mutual information I(X; Y)?
[4 marks]
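The mutual information can be written either directly from the distributions or via the entropies above:
\[
I(X;Y) = \sum_{x,y} p(x,y) \log_2 \frac{p(x,y)}{p(x)\,p(y)} = H(X) + H(Y) - H(X,Y),
\]
which also equals H(Y) − H(Y|X) and H(X) − H(X|Y).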