In: Advanced Math
Suppose the joint probability distribution of two binary random variables X and Y is given as follows.
x\y |  1   |  2
 0  | 3/10 |  0
 1  | 4/10 | 3/10
(X takes the values 0 and 1 down the side; Y takes the values 1 and 2 across the top.)
e) Find the joint entropy H(X, Y).
f) Suppose X and Y are independent. Show that H(X|Y) = H(X).
g) Suppose X and Y are independent. Show that H(X, Y) = H(X) + H(Y).
h) Show that I(X; X) = H(X).
please help with all parts!
thank you!
SOLUTION:
We are given the joint probability distribution below; the row and column sums give the marginals.

x\y    |  1  |  2  | P(X=x)
 0     | 0.3 |  0  |  0.3
 1     | 0.4 | 0.3 |  0.7
P(Y=y) | 0.7 | 0.3 |  1

Note that in this table X and Y are NOT independent: p(0,2) = 0, but P(X=0)P(Y=2) = 0.3 × 0.3 = 0.09. Parts (f) and (g) therefore have to be shown in general from the definitions, not from these numbers. All logs below are base 2, so entropies are in bits.
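As a quick sanity check, here is a minimal Python sketch (the dictionary layout is my own choice, not part of the problem) that recomputes the marginals and tests independence cell by cell:

```python
# Joint distribution from the table: (x, y) -> p(x, y)
p = {(0, 1): 0.3, (0, 2): 0.0, (1, 1): 0.4, (1, 2): 0.3}

# Marginals: sum each row (for X) and each column (for Y)
p_x = {x: sum(v for (a, _), v in p.items() if a == x) for x in (0, 1)}
p_y = {y: sum(v for (_, b), v in p.items() if b == y) for y in (1, 2)}

# Independence would require p(x, y) == p(x) * p(y) in every cell
independent = all(abs(p[(x, y)] - p_x[x] * p_y[y]) < 1e-12
                  for x in (0, 1) for y in (1, 2))
print(p_x, p_y, independent)  # marginals (0.3, 0.7) and (0.7, 0.3); independent is False
```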
e) Find the joint entropy H(X,Y):
The joint entropy sums over all cells of the joint table (it is not a product of the marginal entropies):
H(X,Y) = − Σ p(x,y) log2 p(x,y), with the convention 0·log 0 = 0
H(X,Y) = −[0.3 log2 0.3 + 0 + 0.4 log2 0.4 + 0.3 log2 0.3]
       = 0.5211 + 0.5288 + 0.5211
H(X,Y) ≈ 1.571 bits
(As a check, this is less than H(X) + H(Y) = 0.8813 + 0.8813 = 1.7626 bits, which is consistent with X and Y being dependent here.)
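A short Python sketch of the same computation, assuming the table above (the function name is mine):

```python
import math

# Joint distribution from the table: (x, y) -> p(x, y)
p = {(0, 1): 0.3, (0, 2): 0.0, (1, 1): 0.4, (1, 2): 0.3}

def joint_entropy(joint):
    """H(X,Y) = -sum p(x,y) log2 p(x,y); zero-probability cells contribute 0."""
    return -sum(q * math.log2(q) for q in joint.values() if q > 0)

print(joint_entropy(p))  # ~1.571 bits
```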
f) Suppose X and Y are independent. Show that H(X|Y) = H(X):
By definition,
H(X|Y) = − Σ p(x,y) log p(x|y), summed over all x ∈ X and y ∈ Y.
If X and Y are independent, then p(x|y) = p(x,y)/p(y) = p(x)p(y)/p(y) = p(x), so
H(X|Y) = − Σ_x Σ_y p(x)p(y) log p(x) = −(Σ_y p(y)) · Σ_x p(x) log p(x) = − Σ_x p(x) log p(x) = H(X),
since Σ_y p(y) = 1.
(For the table above the two sides differ: H(X|Y) = 0.6897 bits while H(X) = 0.8813 bits, exactly because that table is not independent.)
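A sketch of that numeric comparison, under the same assumed layout as before:

```python
import math

p = {(0, 1): 0.3, (0, 2): 0.0, (1, 1): 0.4, (1, 2): 0.3}
p_x = {x: sum(v for (a, _), v in p.items() if a == x) for x in (0, 1)}
p_y = {y: sum(v for (_, b), v in p.items() if b == y) for y in (1, 2)}

# H(X|Y) = -sum_{x,y} p(x,y) log2 p(x|y), with p(x|y) = p(x,y) / p(y)
h_x_given_y = -sum(v * math.log2(v / p_y[y]) for (_, y), v in p.items() if v > 0)
h_x = -sum(q * math.log2(q) for q in p_x.values() if q > 0)

print(h_x_given_y, h_x)  # ~0.6897 vs ~0.8813: unequal, since this table is dependent
```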
g) Suppose X and Y are independent. Show that H(X, Y) = H(X) + H(Y):
With independence, p(x,y) = p(x)p(y), so log p(x,y) = log p(x) + log p(y). Then
H(X,Y) = − Σ_x Σ_y p(x)p(y) [log p(x) + log p(y)]
       = −(Σ_y p(y)) Σ_x p(x) log p(x) − (Σ_x p(x)) Σ_y p(y) log p(y)
       = − Σ_x p(x) log p(x) − Σ_y p(y) log p(y)
       = H(X) + H(Y),
since each marginal sums to 1.
For example, if X and Y had the marginals above but were actually independent, we would get H(X,Y) = 0.8813 + 0.8813 = 1.7626 bits.
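To see the identity numerically, one can build a joint that really is independent from the marginals above (the helper H is mine):

```python
import math

def H(probs):
    """Shannon entropy in bits of a list of probabilities."""
    return -sum(q * math.log2(q) for q in probs if q > 0)

p_x = [0.3, 0.7]  # marginal of X
p_y = [0.7, 0.3]  # marginal of Y

# A genuinely independent joint: q(x, y) = p(x) * p(y)
q = [px * py for px in p_x for py in p_y]

print(H(q), H(p_x) + H(p_y))  # both ~1.7626 bits, as the identity predicts
```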
h) Show that I(X; X) = H(X):
Mutual information satisfies I(X; Y) = H(X) − H(X|Y). Taking Y = X,
I(X; X) = H(X) − H(X|X).
But H(X|X) = 0: once X is known there is no remaining uncertainty about X, since p(x|x) = 1 and every term −p(x,x) log p(x|x) vanishes.
Therefore
I(X; X) = H(X) − 0 = H(X).
(For the X in this problem that common value is H(X) = 0.8813 bits.)
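A last sketch checking this identity on the marginal of X from the table (variable names are mine):

```python
import math

p_x = {0: 0.3, 1: 0.7}

# The joint of the pair (X, X) puts mass p(x) on each diagonal cell (x, x)
joint = {(x, x): px for x, px in p_x.items()}

# H(X|X) = -sum p(x,x) log2 p(x|x); every p(x|x) = p(x,x)/p(x) = 1, so this is 0
h_x_given_x = -sum(v * math.log2(v / p_x[x]) for (x, _), v in joint.items() if v > 0)
h_x = -sum(q * math.log2(q) for q in p_x.values() if q > 0)

print(h_x - h_x_given_x, h_x)  # I(X;X) = H(X) - H(X|X) = H(X) ~ 0.8813 bits
```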