Provide an example that if the cov(X,Y) = 0, the two random variables, X and Y, are not necessarily independent.
Could you please give a specific example and explain why it works?
Let X take on the values -1 and 1, with P[X=-1]=P[X=1]=1/2.
I will choose a random variable Y whose distribution depends on X.
Let Y take the values -1, 0, and 1. However, I will tweak the probabilities in the following way.
P[Y=-1|X=-1]=P[Y=1|X=-1]=1/2
P[Y=0|X=1]=1
I constructed the distribution of Y to depend on X.
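To make the setup concrete, here is a minimal Python sketch (not part of the original answer; the variable names are just illustrative) that builds the joint pmf of (X, Y) from the marginal of X and the conditional distribution of Y given X defined above.

```python
# Build the joint pmf P[X=x, Y=y] = P[X=x] * P[Y=y | X=x] from the definitions above.
p_x = {-1: 0.5, 1: 0.5}                 # P[X=-1] = P[X=1] = 1/2
p_y_given_x = {
    -1: {-1: 0.5, 0: 0.0, 1: 0.5},      # P[Y=-1 | X=-1] = P[Y=1 | X=-1] = 1/2
     1: {-1: 0.0, 0: 1.0, 1: 0.0},      # P[Y=0  | X=1]  = 1
}

joint = {(x, y): p_x[x] * p_y_given_x[x][y]
         for x in p_x for y in (-1, 0, 1)}

# Only (X, Y) = (-1,-1), (-1,1), and (1,0) have positive probability.
print({pair: p for pair, p in joint.items() if p > 0})
# {(-1, -1): 0.25, (-1, 1): 0.25, (1, 0): 0.5}
```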
Now let's calculate E[XY]. The possible values of XY are -1, 0, or 1.
When XY=1, either X=Y=1 or X=Y=-1. But if X=1 then Y=0, since P[Y=0|X=1]=1, so we discard the possibility X=Y=1.
P[Y=-1|X=-1]=P[X=-1,Y=-1]/P[X=-1]=1/2
Therefore, P[X=-1,Y=-1]=1/2 × 1/2=1/4, so P[XY=1]=1/4.
When XY=0, at least one of X and Y must be 0. Since X never takes the value 0, this requires Y=0, so the possibilities are (X=-1,Y=0) and (X=1,Y=0).
But Y=0 occurs only when X=1, since P[Y=0|X=-1]=0. So, P[XY=0]=P[X=1,Y=0]=P[Y=0|X=1] × P[X=1]=1 × 1/2=1/2.
When XY=-1, either X=-1 and Y=1, or X=1 and Y=-1. But X=1 and Y=-1 is not possible, since P[Y=0|X=1]=1.
P[Y=1|X=-1]=P[X=-1,Y=1]/P[X=-1]=1/2
Therefore, P[X=-1,Y=1]=1/2 × 1/2=1/4, so P[XY=-1]=1/4.
In summary, P[XY=1]=1/4, P[XY=0]=1/2, P[XY=-1]=1/4
Thus, E[XY]=1 × 1/4 + 0 × 1/2 + (-1) × 1/4=0
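As a sanity check on the case-by-case bookkeeping above, the distribution of the product XY (and hence E[XY]) can be tabulated directly from the joint pmf. A short sketch, restating the joint table so it runs on its own:

```python
from collections import defaultdict

# Tabulate the pmf of the product XY from the joint pmf of (X, Y).
joint = {(-1, -1): 0.25, (-1, 1): 0.25, (1, 0): 0.5}   # joint pmf from the construction above

pmf_xy = defaultdict(float)
for (x, y), p in joint.items():
    pmf_xy[x * y] += p

print(dict(pmf_xy))                           # {1: 0.25, -1: 0.25, 0: 0.5}
print(sum(v * p for v, p in pmf_xy.items()))  # E[XY] = 0.0
```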
E[X]=1 × 1/2 + (-1) × 1/2=0
At this point we don't even need to calculate E[Y], because E[X]E[Y]=0 anyway since E[X]=0.
Therefore Cov(X,Y)=E[XY]-E[X]E[Y]=0
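The whole covariance computation can likewise be verified by exact enumeration over the outcomes with positive probability; the following sketch (again restating the joint table) confirms E[XY]=E[X]=E[Y]=0 and hence Cov(X,Y)=0.

```python
# Exact check of E[XY], E[X], E[Y], and Cov(X, Y) over the outcomes with nonzero probability.
joint = {(-1, -1): 0.25, (-1, 1): 0.25, (1, 0): 0.5}   # joint pmf from the construction above

e_xy = sum(p * x * y for (x, y), p in joint.items())   # = 1/4 - 1/4 + 0 = 0
e_x  = sum(p * x for (x, y), p in joint.items())       # = -1/4 - 1/4 + 1/2 = 0
e_y  = sum(p * y for (x, y), p in joint.items())       # = -1/4 + 1/4 + 0 = 0

cov = e_xy - e_x * e_y
print(e_xy, e_x, e_y, cov)                             # 0.0 0.0 0.0 0.0
```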
But, X and Y are not independent. Why?
Consider P[X=1|Y=1]. If they are independent this should equal P[X=1]=1/2.
But from how we defined Y, P[Y=0|X=1]=1, which readily implies that P[Y=1|X=1]=0, and hence P[X=1,Y=1]=0.
Since P[Y=1]=P[X=-1,Y=1]=1/4>0, we get P[X=1|Y=1]=P[X=1,Y=1]/P[Y=1]=0, which doesn't equal P[X=1]=1/2. Hence, X and Y are not independent.
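The failure of independence can also be checked numerically from the same joint table: P[X=1,Y=1]=0 while P[X=1]P[Y=1]=1/8, so the factorization required for independence fails. A small illustrative sketch:

```python
# Independence would require P[X=x, Y=y] = P[X=x] * P[Y=y] for every pair (x, y).
joint = {(-1, -1): 0.25, (-1, 1): 0.25, (1, 0): 0.5}     # joint pmf from the construction above

p_x1 = sum(p for (x, y), p in joint.items() if x == 1)   # P[X=1] = 0.5
p_y1 = sum(p for (x, y), p in joint.items() if y == 1)   # P[Y=1] = 0.25
p_x1_y1 = joint.get((1, 1), 0.0)                         # P[X=1, Y=1] = 0

print(p_x1_y1, p_x1 * p_y1)   # 0.0 vs 0.125 -> the product rule fails, so X and Y are dependent
print(p_x1_y1 / p_y1, p_x1)   # P[X=1 | Y=1] = 0.0 vs P[X=1] = 0.5
```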