Question

In: Advanced Math

Suppose the joint probability distribution of two binary random variables X and Y is given as follows.

        Y = 1   Y = 2
X = 0   3/10    0
X = 1   4/10    3/10

(X runs along the side with values 0 and 1; Y runs along the top with values 1 and 2.)

e) Find joint entropy H(X, Y ).

f) Suppose X and Y are independent. Show that H(X|Y ) = H(X).

g) Suppose X and Y are independent. Show that H(X, Y ) = H(X) + H(Y ).

h) Show that I(X; X) = H(X).

please help with all parts!

thank you!

Solutions

Expert Solution

SOLUTION:

Given: the joint probability distribution of the two binary random variables X and Y, with the marginal totals filled in, is shown below.

           Y = 1   Y = 2   P(X = x)
X = 0       0.3     0        0.3
X = 1       0.4     0.3      0.7
P(Y = y)    0.7     0.3      1

e) Find the joint entropy H(X, Y):

H(X,Y) = − Σ_{x,y} p(x,y) log2 p(x,y)

= −[0.3 log2 0.3 + 0.4 log2 0.4 + 0.3 log2 0.3]   (the zero-probability cell contributes 0)

= 0.5211 + 0.5288 + 0.5211

H(X,Y) ≈ 1.5710 bits
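As a quick numerical sanity check, the joint entropy can be computed directly from the table with a few lines of Python (the helper name `entropy` is just illustrative):

```python
import math

# Joint distribution p(x, y) read off the table above.
joint = {(0, 1): 0.3, (0, 2): 0.0,
         (1, 1): 0.4, (1, 2): 0.3}

def entropy(probs):
    """Shannon entropy in bits; zero-probability terms contribute 0."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

H_XY = entropy(joint.values())
print(f"H(X,Y) = {H_XY:.4f} bits")  # H(X,Y) = 1.5710 bits
```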

f) Suppose X and Y are independent. Show that H(X|Y) = H(X):

By definition,

H(X|Y) = − Σ_{x,y} p(x,y) log2 p(x|y)

If X and Y are independent, then p(x|y) = p(x) for every y with p(y) > 0, so

H(X|Y) = − Σ_{x,y} p(x,y) log2 p(x) = − Σ_x p(x) log2 p(x) = H(X),

where the middle step sums p(x,y) over y to recover the marginal p(x).

(Note that the X and Y in the table above are not independent — for example p(x=0, y=2) = 0 while P(X=0)P(Y=2) = 0.3 × 0.3 = 0.09. That is why, for this table, H(X|Y) = H(X,Y) − H(Y) = 1.5710 − 0.8813 = 0.6897 differs from H(X) = 0.8813.)

g) Suppose X and Y are independent. Show that H(X, Y) = H(X) + H(Y):

If X and Y are independent, then p(x,y) = p(x)p(y), so log2 p(x,y) = log2 p(x) + log2 p(y). Hence

H(X,Y) = − Σ_{x,y} p(x,y) log2 p(x,y)

= − Σ_{x,y} p(x,y) log2 p(x) − Σ_{x,y} p(x,y) log2 p(y)

= − Σ_x p(x) log2 p(x) − Σ_y p(y) log2 p(y)

= H(X) + H(Y),

since summing p(x,y) over the other variable leaves the marginals. Equivalently, by the chain rule and part (f): H(X,Y) = H(Y) + H(X|Y) = H(Y) + H(X).
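To see the identity numerically, one can build an independent joint distribution from the marginals of the table above and confirm that both (f) and (g) hold (variable names here are just for illustration):

```python
import math

def entropy(probs):
    """Shannon entropy in bits; zero-probability terms contribute 0."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Marginals taken from the table above.
px = {0: 0.3, 1: 0.7}
py = {1: 0.7, 2: 0.3}

# An *independent* joint: p(x, y) = p(x) * p(y).
joint = {(x, y): px[x] * py[y] for x in px for y in py}

H_X = entropy(px.values())
H_Y = entropy(py.values())
H_XY = entropy(joint.values())
H_X_given_Y = H_XY - H_Y  # chain rule: H(X|Y) = H(X,Y) - H(Y)

print(abs(H_XY - (H_X + H_Y)) < 1e-9)   # True  -> part (g)
print(abs(H_X_given_Y - H_X) < 1e-9)    # True  -> part (f)
```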

h) Show that I(X; X) = H(X):

Mutual information satisfies

I(X; Y) = H(X) − H(X|Y)

Setting Y = X gives

I(X; X) = H(X) − H(X|X) = H(X) − 0 = H(X),

since once X is known there is no remaining uncertainty about X, i.e. H(X|X) = 0.

For the table above, H(X) = −[0.3 log2 0.3 + 0.7 log2 0.7] = 0.8813 bits, so I(X; X) = 0.8813 bits.
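This identity can also be checked directly from the definition of mutual information, using the joint distribution of (X, X), which puts all its mass on the diagonal (the helper name `mutual_information` is illustrative):

```python
import math

px = {0: 0.3, 1: 0.7}  # marginal of X from the table above

def mutual_information(joint, pa, pb):
    """I(A;B) = sum over (a,b) of p(a,b) * log2( p(a,b) / (p(a) p(b)) )."""
    return sum(p * math.log2(p / (pa[a] * pb[b]))
               for (a, b), p in joint.items() if p > 0)

# Joint distribution of (X, X): p(x, x) = p(x), zero off the diagonal.
joint_xx = {(x, x): p for x, p in px.items()}

I_XX = mutual_information(joint_xx, px, px)
H_X = -sum(p * math.log2(p) for p in px.values())
print(abs(I_XX - H_X) < 1e-9)  # True
```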

