Let X and Y be jointly normal random variables with parameters
E(X) = E(Y) = 0, Var(X) = Var(Y) = 1, and Cor(X, Y) = ρ. For
some nonnegative constants a ≥ 0 and b ≥ 0 with a² + b² > 0,
define W1 = aX + bY and W2 = bX + aY.
(a) Show that Cor(W1, W2) ≥ ρ.
(b) Assume that ρ = 0.1. Are W1 and W2 independent or not?
Why?
(c) Assume now...
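A quick Monte Carlo sketch of part (a), not part of the exercise itself: with illustrative values a = 1, b = 2, ρ = 0.3 (any a, b ≥ 0 with a² + b² > 0 would do), the sample correlation of W1 and W2 matches the closed form and exceeds ρ.

```python
import numpy as np

# Illustrative parameter choices (assumptions, not from the exercise).
rho, a, b = 0.3, 1.0, 2.0
n = 200_000
rng = np.random.default_rng(0)

# Draw jointly normal (X, Y) with unit variances and correlation rho.
cov = np.array([[1.0, rho], [rho, 1.0]])
X, Y = rng.multivariate_normal([0.0, 0.0], cov, size=n).T

W1 = a * X + b * Y
W2 = b * X + a * Y

# Closed form: Cor(W1, W2) = (2ab + (a^2 + b^2) rho) / (a^2 + b^2 + 2ab rho),
# and Cor(W1, W2) - rho = 2ab(1 - rho^2) / (a^2 + b^2 + 2ab rho) >= 0.
theory = (2 * a * b + (a**2 + b**2) * rho) / (a**2 + b**2 + 2 * a * b * rho)
empirical = np.corrcoef(W1, W2)[0, 1]
print(theory, empirical)  # both exceed rho = 0.3
```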
Let X and Y be continuous random variables with E[X] = E[Y] = 4
and var(X) = var(Y) = 10. A new random variable is defined as
W = X + 2Y + 2.
a. Find E[W] and var[W] if X and Y are independent.
b. Find E[W] and var[W] if E[XY] = 20.
c. If we find that E[XY] = E[X]E[Y], what do we know about the
relationship between the random variables X and Y?
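A simulation sketch for part a: the exercise does not fix the distributions, so as an illustrative assumption take X and Y independent normal with mean 4 and variance 10. Linearity gives E[W] = E[X] + 2E[Y] + 2 = 14, and independence gives var(W) = var(X) + 4 var(Y) = 50.

```python
import numpy as np

# Assumed distributions for illustration only: the exercise specifies
# just the means and variances, not normality.
rng = np.random.default_rng(1)
n = 500_000
X = rng.normal(4.0, np.sqrt(10.0), n)
Y = rng.normal(4.0, np.sqrt(10.0), n)

W = X + 2 * Y + 2

# Linearity of expectation: E[W] = E[X] + 2 E[Y] + 2 = 4 + 8 + 2 = 14.
# Independence:            var(W) = var(X) + 4 var(Y) = 10 + 40 = 50.
print(W.mean(), W.var())
```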
Let X and Y be random variables with means µX and µY. The
covariance of X and Y is given by Cov(X, Y) = E[(X − µX)(Y − µY)].
a) Prove the following three equalities: Cov(X, Y) = E[(X −
µX)Y] = E[X(Y − µY)] = E(XY) − µXµY.
b) Suppose that E(Y|X) = E(Y). Show that Cov(X, Y) = 0 (hint:
use the law of iterated expectations to show...
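A numerical sanity check for part a (not a proof): on sample data, the sample analogue of the covariance definition agrees exactly with the sample analogue of E(XY) − µXµY, since the identity is purely algebraic.

```python
import numpy as np

# Arbitrary correlated sample data for illustration.
rng = np.random.default_rng(2)
X = rng.normal(size=100_000)
Y = 0.5 * X + rng.normal(size=100_000)

mu_X, mu_Y = X.mean(), Y.mean()
lhs = ((X - mu_X) * (Y - mu_Y)).mean()  # definition: E[(X - muX)(Y - muY)]
rhs = (X * Y).mean() - mu_X * mu_Y      # shortcut:   E(XY) - muX muY
print(lhs, rhs)
```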
Let X, Y be independent exponential random variables with mean
one. Show that X/(X + Y ) is uniformly distributed on [0, 1].
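A Monte Carlo sketch of the claim: with X, Y i.i.d. Exp(1), the empirical CDF of X/(X + Y) should match the Uniform[0, 1] CDF, F(t) = t.

```python
import numpy as np

# Draw i.i.d. mean-one exponentials.
rng = np.random.default_rng(3)
n = 500_000
X = rng.exponential(1.0, n)
Y = rng.exponential(1.0, n)

R = X / (X + Y)

# Compare the empirical CDF of R at a few points with F(t) = t.
for t in (0.1, 0.25, 0.5, 0.9):
    print(t, (R <= t).mean())
```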
Let X and Y be two independent random variables, and g : R² → R
an arbitrary bivariate function.
1) Suppose that X and Y are continuous with densities fX and fY.
Prove that for any y ∈ R with fY(y) > 0, the conditional
density of the random variable Z = g(X, Y) given Y = y is the same
as the density of the random variable W = g(X, y).
2) Suppose that X and Y...
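A simulation sketch of part 1 with illustrative choices g(x, y) = xy, X and Y independent standard normal, and y0 = 1.5: samples of Z = g(X, Y) with Y falling near y0 should have the same distribution as fresh samples of W = g(X, y0).

```python
import numpy as np

# Illustrative choices (assumptions): g(x, y) = x * y, standard normals.
rng = np.random.default_rng(4)
n = 1_000_000
X = rng.normal(size=n)
Y = rng.normal(size=n)
y0, eps = 1.5, 0.05

# Approximate conditioning on Y = y0 by keeping samples with Y near y0.
mask = np.abs(Y - y0) < eps
Z_cond = X[mask] * Y[mask]

# W = g(X, y0): fresh X draws with y fixed at y0.
W = rng.normal(size=mask.sum()) * y0

# The two empirical distributions should have matching quartiles.
print(np.quantile(Z_cond, [0.25, 0.5, 0.75]))
print(np.quantile(W, [0.25, 0.5, 0.75]))
```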
Let X and Y be two discrete random variables with a common mgf
of e^(4t). After some analysis you conclude that X = 4 and Y = 6. Is
this a valid claim? Why or why not?
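A hint-level numerical check, not a full answer: the mgf of a constant c is E[e^(tc)] = e^(ct), so e^(4t) can be compared against the mgfs of the constants 4 and 6 on a grid of t values.

```python
import numpy as np

# The mgf of a constant c is e^(ct); evaluate on a small grid of t.
ts = np.linspace(-1.0, 1.0, 5)
mgf_given = np.exp(4 * ts)   # the common mgf from the problem
mgf_const4 = np.exp(4 * ts)  # mgf of a variable equal to 4 a.s.
mgf_const6 = np.exp(6 * ts)  # mgf of a variable equal to 6 a.s.
print(np.allclose(mgf_given, mgf_const4), np.allclose(mgf_given, mgf_const6))
```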