Define a joint distribution for two random variables (X, Y) such that (i) Cov(X,Y) = 0 and (ii) E[Y | X] is not equal to E[Y].
How do I define a joint distribution that satisfies both (i) and (ii) above at the same time?
Please give me an example and an explanation of how it meets both conditions.
(i) Cov(X,Y) = 0 (the condition that X and Y are uncorrelated)
(ii) E[Y | X] is not equal to E[Y] (the condition that X and Y are dependent)
So what you need is an example of variables that are dependent yet uncorrelated.
As you may know, independence implies zero correlation, but zero correlation does not imply independence.
For example, we can define a discrete random variable
X ∈ {−1, 0, 1} with P(X = −1) = P(X = 0) = P(X = 1) = 1/3
and then define
Y = 1, if X = 0
Y = 0, otherwise
Y \ X |  -1  |   0  |   1  | Total
  1   |   0  |  1/3 |   0  |  1/3
  0   |  1/3 |   0  |  1/3 |  2/3
Total |  1/3 |  1/3 |  1/3 |   1
Cov(X,Y) = E(XY) - E(X)*E(Y)
E(X) = 1/3*(-1) + 1/3*(0) + 1/3*(1) = -1/3 + 0 + 1/3 = 0
E(Y) = 1*P(Y = 1) + 0*P(Y = 0) = 1*(1/3) + 0*(2/3) = 1/3 ( from the Total column of the table: Y = 1 only when X = 0 )
E(XY) = (-1)*1*0 + 0*1*(1/3) + 1*1*0 + (-1)*0*(1/3) + 0*0*0 + 1*0*(1/3) = 0
Cov(X,Y) = E(XY) - E(X)*E(Y) = 0 - 0*(1/3) = 0
( uncorrelated )
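If you want to double-check the arithmetic, here is a minimal Python sketch that computes the moments exactly from the joint pmf above (the dictionary pmf lists only the three cells of the table with nonzero probability):

```python
# Exact check of Cov(X, Y) = 0 using the joint pmf from the table above.
from fractions import Fraction

# joint pmf: {(x, y): P(X = x, Y = y)}
pmf = {(-1, 0): Fraction(1, 3), (0, 1): Fraction(1, 3), (1, 0): Fraction(1, 3)}

EX  = sum(p * x     for (x, y), p in pmf.items())
EY  = sum(p * y     for (x, y), p in pmf.items())
EXY = sum(p * x * y for (x, y), p in pmf.items())

print("E[X]  =", EX)             # 0
print("E[Y]  =", EY)             # 1/3
print("E[XY] =", EXY)            # 0
print("Cov   =", EXY - EX * EY)  # 0
```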
Now, E(Y) = 1/3, while the conditional expectation depends on the value of X:
E( Y | X = 0 ) = 1 ( Y is always 1 when X = 0 )
E( Y | X = -1 ) = E( Y | X = 1 ) = 0 ( Y is always 0 when X = ±1 )
Since E( Y | X = 0 ) = 1 != 1/3 = E(Y), we get E( Y | X ) != E( Y )
( dependent )
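The same kind of sketch confirms the conditional means, again using nothing beyond the pmf above:

```python
# Check that E[Y | X = x] is not constant, using the same joint pmf.
from fractions import Fraction

pmf = {(-1, 0): Fraction(1, 3), (0, 1): Fraction(1, 3), (1, 0): Fraction(1, 3)}

EY = sum(p * y for (x, y), p in pmf.items())  # 1/3

for x0 in (-1, 0, 1):
    px = sum(p for (x, y), p in pmf.items() if x == x0)                # P(X = x0)
    ey_given_x = sum(p * y for (x, y), p in pmf.items() if x == x0) / px
    print(f"E[Y | X = {x0:2d}] = {ey_given_x}   (E[Y] = {EY})")
# E[Y | X = 0] = 1 differs from E[Y] = 1/3, so X and Y are dependent.
```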
Another example: take any X symmetric about 0 (so that E(X) = E(X^3) = 0) and set Y = X^2. Then Cov(X,Y) = E(X^3) - E(X)*E(X^2) = 0, yet E( Y | X ) = X^2 clearly depends on X.
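Here is a small Monte Carlo sketch of that second example. I am assuming X ~ N(0,1) just for concreteness; any distribution symmetric about 0 with a finite third moment works the same way:

```python
# Monte Carlo sanity check for the Y = X^2 example, with X standard normal.
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal(1_000_000)
y = x ** 2

print("sample Cov(X, Y)  ~", np.cov(x, y)[0, 1])         # close to 0
print("E[Y]              ~", y.mean())                    # close to 1
print("E[Y | |X| < 0.1]  ~", y[np.abs(x) < 0.1].mean())  # close to 0, so dependent
```

The conditional mean of Y near X = 0 is close to 0 while the overall mean is close to 1, which makes the dependence visible even though the sample covariance is essentially zero.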