Prove the following. Given two random variables X and Y over the same alphabet:
(a) Show that the Kullback–Leibler divergence (relative entropy) satisfies
D(P(X,Y) || P(X)P(Y)) = H(X) − H(X|Y).
(One possible derivation is sketched after part (c).)
(b) Show that mutual information (MI) is bounded by 0 ≤ I(X;Y) ≤ min[H(X), H(Y)],
with equality on the left if and only if X and Y are independent random variables,
and with equality on the right if and only if either Y essentially determines X,
or X essentially determines Y, or both. (See the corresponding sketch below.)
(c) Show that MI is symmetric, i.e. I(X;Y) = I(Y;X). (See the sketch below.)
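
For part (a), one possible route, assuming only the standard definitions
D(p||q) = Σ p log(p/q), H(X) = −Σ_x p(x) log p(x), and
H(X|Y) = −Σ_{x,y} p(x,y) log p(x|y), with all sums over the alphabet and a fixed
log base:

\begin{align*}
D\bigl(P(X,Y)\,\|\,P(X)P(Y)\bigr)
  &= \sum_{x,y} p(x,y)\,\log\frac{p(x,y)}{p(x)\,p(y)}
   = \sum_{x,y} p(x,y)\,\log\frac{p(x\mid y)}{p(x)} \\
  &= -\sum_{x,y} p(x,y)\,\log p(x)
     \;+\; \sum_{x,y} p(x,y)\,\log p(x\mid y) \\
  &= -\sum_{x} p(x)\,\log p(x) \;-\; H(X\mid Y)
   \;=\; H(X) - H(X\mid Y).
\end{align*}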
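For part (b), a sketch under the same definitions: the left bound follows from the
non-negativity of relative entropy (Jensen's inequality / the log-sum inequality),
which is zero exactly when the two distributions coincide; the right bound follows
from the non-negativity of conditional entropy:

\begin{align*}
I(X;Y) &= D\bigl(P(X,Y)\,\|\,P(X)P(Y)\bigr) \;\ge\; 0,
  \quad\text{with equality iff } p(x,y)=p(x)\,p(y)\ \text{for all } x,y,\\
I(X;Y) &= H(X) - H(X\mid Y) \;\le\; H(X)
  \quad\text{since } H(X\mid Y)\ge 0,\\
I(X;Y) &= H(Y) - H(Y\mid X) \;\le\; H(Y),
  \quad\text{hence } I(X;Y)\le\min\{H(X),H(Y)\}.
\end{align*}

Equality on the right holds iff H(X|Y) = 0 or H(Y|X) = 0, i.e. iff X is (with
probability 1) a function of Y, or Y is a function of X, or both.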
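For part (c), symmetry can be read off the sum form, since the summand treats x and
y symmetrically:

\begin{align*}
I(X;Y) &= \sum_{x,y} p(x,y)\,\log\frac{p(x,y)}{p(x)\,p(y)}
        = \sum_{y,x} p(y,x)\,\log\frac{p(y,x)}{p(y)\,p(x)}
        = I(Y;X),
\end{align*}

or equivalently, using the chain rule H(X,Y) = H(Y) + H(X|Y) = H(X) + H(Y|X),
H(X) − H(X|Y) = H(X) + H(Y) − H(X,Y) = H(Y) − H(Y|X).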