In: Statistics and Probability
PART A: Suppose X and Y are independent. Show that H(X|Y) = H(X). (H denotes entropy.)
PART B: Suppose X and Y are independent. Show that H(X,Y) = H(X) + H(Y).
PART C: Prove that mutual information is symmetric, i.e., I(X,Y) = I(Y,X), where xi ∈ X, yi ∈ Y.
ANSWER:
PART A:
By definition, H(X|Y) = -Σ_x Σ_y p(x,y) log p(x|y).
Since X and Y are independent, p(x|y) = p(x) and p(x,y) = p(x)p(y). Then
H(X|Y) = -Σ_x Σ_y p(x)p(y) log p(x)
       = -(Σ_y p(y)) (Σ_x p(x) log p(x))
       = -Σ_x p(x) log p(x)          [since Σ_y p(y) = 1]
       = H(X).

PART B:
By definition, H(X,Y) = -Σ_x Σ_y p(x,y) log p(x,y).
Using independence, p(x,y) = p(x)p(y), so log p(x,y) = log p(x) + log p(y). Then
H(X,Y) = -Σ_x Σ_y p(x)p(y) [log p(x) + log p(y)]
       = -(Σ_y p(y)) Σ_x p(x) log p(x) - (Σ_x p(x)) Σ_y p(y) log p(y)
       = -Σ_x p(x) log p(x) - Σ_y p(y) log p(y)
       = H(X) + H(Y).

PART C:
By definition, I(X,Y) = Σ_x Σ_y p(x,y) log [ p(x,y) / (p(x)p(y)) ].
Since p(x,y) = p(y,x) and p(x)p(y) = p(y)p(x), swapping the order of summation gives
I(X,Y) = Σ_y Σ_x p(y,x) log [ p(y,x) / (p(y)p(x)) ] = I(Y,X).
(No independence assumption is needed here.) Equivalently,
I(X,Y) = H(X) + H(Y) - H(X,Y) = H(Y) + H(X) - H(Y,X) = I(Y,X).
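The three identities can also be checked numerically. Here is a small sketch using made-up example distributions (the specific probabilities below are illustrative assumptions, not part of the problem):

```python
# Numerical sanity check of Parts A, B, C with example distributions.
import math

def entropy(probs):
    """Shannon entropy in bits of an iterable of probabilities."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Independent pair: p(x, y) = p(x) * p(y)
pX = {0: 0.25, 1: 0.75}
pY = {'a': 0.5, 'b': 0.3, 'c': 0.2}
pXY = {(x, y): pX[x] * pY[y] for x in pX for y in pY}

HX = entropy(pX.values())
HY = entropy(pY.values())
HXY = entropy(pXY.values())
H_X_given_Y = HXY - HY          # chain rule: H(X|Y) = H(X,Y) - H(Y)

# Part A: H(X|Y) = H(X);  Part B: H(X,Y) = H(X) + H(Y)
assert abs(H_X_given_Y - HX) < 1e-12
assert abs(HXY - (HX + HY)) < 1e-12

# Part C: symmetry of mutual information, checked on a *dependent* joint
pXY2 = {(0, 'a'): 0.3, (0, 'b'): 0.1, (1, 'a'): 0.2, (1, 'b'): 0.4}
qX = {x: sum(p for (xx, _), p in pXY2.items() if xx == x) for x in (0, 1)}
qY = {y: sum(p for (_, yy), p in pXY2.items() if yy == y) for y in ('a', 'b')}

I_XY = sum(p * math.log2(p / (qX[x] * qY[y])) for (x, y), p in pXY2.items())
I_YX = sum(p * math.log2(p / (qY[y] * qX[x])) for (x, y), p in pXY2.items())
assert abs(I_XY - I_YX) < 1e-12
print("all identities verified")
```

For the independent joint, the mutual information is 0 (every log term vanishes), which is why symmetry is checked on a dependent joint instead.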