Question

In: Math


Define a joint distribution for two random variables (X,Y) such that (i) Cov(X,Y) = 0 and (ii) E[Y | X] is not equal to E[Y].

How do I define a joint distribution that satisfies both (i) and (ii) above at the same time?

Please give me an example and explanation of how it meets the two conditions.

Solutions

Expert Solution

(i) Cov(X,Y) = 0 ( X and Y are uncorrelated )

(ii) E[Y | X] is not equal to E[Y] ( the conditional mean of Y depends on X, i.e., X and Y are dependent )

In other words, you need an example of variables that are dependent yet uncorrelated.

As you may know, independence implies zero correlation, but zero correlation does not imply independence.

For example,

We can define a discrete random variable

X ∈ {−1, 0, 1} with P(X = −1) = P(X = 0) = P(X = 1) = 1/3,

and then define

Y = 1 if X = 0, and Y = 0 otherwise.

Y \ X    -1     0     1    Total
  1       0    1/3    0     1/3
  0      1/3    0    1/3    2/3

Cov(X,Y) = E(XY) - E(X)*E(Y)

E(X) = 1/3*(-1) + 1/3*(0) + 1/3*(1) = -1/3 + 0 + 1/3 = 0

E(Y) = 1*P(Y = 1) + 0*P(Y = 0) = 1*(1/3) + 0*(2/3) = 1/3 ( Y = 1 exactly when X = 0, which happens with probability 1/3 )

E(XY) = (-1)*1*0 + 0*1*(1/3) + 1*1*0 + (-1)*0*(1/3) + 0*0*0 + 1*0*(1/3) = 0

Cov(X,Y) = E(XY) - E(X)*E(Y) = 0 - 0*(1/3) = 0

( uncorrelated )

Now, E(Y) = 1/3, while the conditional expectation is

E( Y | X = 0 ) = 1 and E( Y | X = -1 ) = E( Y | X = 1 ) = 0

So E( Y | X ) is a random variable that equals 1 when X = 0 and 0 otherwise; it never takes the value E(Y) = 1/3.

E( Y | X ) != E( Y )

( dependent )
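As a quick numerical sanity check (a minimal sketch, not part of the original solution), the joint pmf above can be enumerated in Python to confirm both conditions:

```python
# Joint pmf of (X, Y): Y = 1 exactly when X = 0; each X value has probability 1/3.
pmf = {(-1, 0): 1/3, (0, 1): 1/3, (1, 0): 1/3}

EX  = sum(x * p for (x, y), p in pmf.items())
EY  = sum(y * p for (x, y), p in pmf.items())
EXY = sum(x * y * p for (x, y), p in pmf.items())
cov = EXY - EX * EY
print(cov)  # condition (i): covariance is 0 -> uncorrelated

# Condition (ii): E[Y | X = x] for each x differs from E[Y] = 1/3.
for x0 in (-1, 0, 1):
    px = sum(p for (x, y), p in pmf.items() if x == x0)
    ey_given_x0 = sum(y * p for (x, y), p in pmf.items() if x == x0) / px
    print(x0, ey_given_x0)  # 1 when x0 = 0, else 0
```

The loop makes the dependence visible: the conditional mean jumps between 0 and 1 as x changes, so it cannot equal the constant E(Y) = 1/3.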

Another example: keep the same X (symmetric about 0) and take Y = X^2. Then Cov(X,Y) = E(X^3) - E(X)*E(X^2) = 0 by symmetry, but Y is a deterministic function of X, so E( Y | X ) = X^2, which is not constant and hence not equal to E(Y).
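The Y = X^2 variant can be checked the same way (again a sketch, using X uniform on {-1, 0, 1} as an assumed concrete choice):

```python
# X uniform on {-1, 0, 1}; Y = X**2 is a deterministic function of X,
# so X and Y are dependent, yet Cov(X, Y) = E[X^3] - E[X]*E[X^2] = 0.
support = [-1, 0, 1]
p = 1/3

EX  = sum(x * p for x in support)         # 0 by symmetry
EY  = sum(x**2 * p for x in support)      # E[X^2] = 2/3
EXY = sum(x * x**2 * p for x in support)  # E[X^3] = 0 by symmetry
cov = EXY - EX * EY
print(cov)  # 0.0, yet E[Y | X = x] = x**2 varies with x
```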

