Question

In: Statistics and Probability


Let X and Y be random variables with finite means. Show that

min over g of E(Y − g(X))^2 = E(Y − E(Y|X))^2

Hint: a − b = (a − c) + (c − b)

Solutions

Expert Solution
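A standard argument, following the hint with a = Y, b = g(X), c = E(Y|X), and assuming additionally that Y and g(X) have finite second moments so all the expectations below exist:

```latex
\begin{aligned}
E\big[(Y - g(X))^2\big]
  &= E\Big[\big(\,(Y - E(Y\mid X)) + (E(Y\mid X) - g(X))\,\big)^2\Big] \\
  &= E\big[(Y - E(Y\mid X))^2\big] + E\big[(E(Y\mid X) - g(X))^2\big] \\
  &\quad + 2\,E\big[(Y - E(Y\mid X))\,(E(Y\mid X) - g(X))\big].
\end{aligned}
```

The cross term vanishes by the law of iterated expectations: conditioning on X, the factor E(Y|X) − g(X) is a function of X and comes out of the conditional expectation, leaving

E[(Y − E(Y|X))(E(Y|X) − g(X)) | X] = (E(Y|X) − g(X)) · E[Y − E(Y|X) | X] = 0.

Therefore E(Y − g(X))^2 = E(Y − E(Y|X))^2 + E(E(Y|X) − g(X))^2 ≥ E(Y − E(Y|X))^2, with equality if and only if g(X) = E(Y|X) almost surely. So the minimum over g is attained at g(X) = E(Y|X), which is the claim.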

Note: if anything here is unclear, please feel free to ask via the comment box. Thank you.


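A quick numerical sanity check of the result (a sketch under an assumed model, not part of the proof): take Y = X^2 + ε with X and ε independent standard normal, so E(Y|X) = X^2, and compare the Monte Carlo mean squared error of the conditional mean against a few other candidate predictors g(X).

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# Assumed model: Y = X^2 + eps, with X and eps independent standard
# normal, so the conditional mean is E(Y | X) = X^2.
x = rng.standard_normal(n)
eps = rng.standard_normal(n)
y = x**2 + eps

# MSE of the conditional mean versus some arbitrary alternatives g(X).
candidates = {
    "E(Y|X) = x^2": x**2,
    "g(x) = x":     x,
    "g(x) = 1":     np.ones(n),   # the unconditional mean E(Y) = 1
    "g(x) = 2x^2":  2 * x**2,
}
mse = {name: np.mean((y - g)**2) for name, g in candidates.items()}

for name, value in mse.items():
    print(f"{name:14s} MSE = {value:.3f}")

# E(Y|X) = x^2 attains the smallest MSE, approximately Var(eps) = 1,
# matching E(Y - E(Y|X))^2 = E(eps^2) = 1 for this model.
```

In this model the theoretical values are E(Y − E(Y|X))^2 = 1, versus 5, 3, and 4 for the three alternatives, so the conditional mean should win by a wide margin.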
Related Solutions

Let X and Y be jointly normal random variables with parameters E(X) = E(Y ) =...
Let X and Y be jointly normal random variables with parameters E(X) = E(Y) = 0, Var(X) = Var(Y) = 1, and Cor(X, Y) = ρ. For some nonnegative constants a ≥ 0 and b ≥ 0 with a^2 + b^2 > 0, define W1 = aX + bY and W2 = bX + aY. (a) Show that Cor(W1, W2) ≥ ρ. (b) Assume that ρ = 0.1. Are W1 and W2 independent or not? Why? (c) Assume now...
Let X and Y be continuous random variables with E[X] = E[Y] = 4 and var(X)...
Let X and Y be continuous random variables with E[X] = E[Y] = 4 and var(X) = var(Y) = 10. A new random variable is defined as: W = X+2Y+2. a. Find E[W] and var[W] if X and Y are independent. b. Find E[W] and var[W] if E[XY] = 20. c. If we find that E[XY] = E[X]E[Y], what do we know about the relationship between the random variables X and Y?
Let X and Y be random variables with means µX and µY . The covariance of...
Let X and Y be random variables with means µX and µY . The covariance of X and Y is given by, Cov(X, Y ) = E[(X − µX)(Y − µY )] a) Prove the following three equalities: Cov(X, Y ) = E[(X − µX)Y ] = E[X(Y − µY )] = E(XY ) − µXµY b) Suppose that E(Y |X) = E(Y ). Show that Cov(X, Y ) = 0 (hint: use the law of interated expectations to show...
Let X, Y be independent exponential random variables with mean one. Show that X/(X + Y...
Let X, Y be independent exponential random variables with mean one. Show that X/(X + Y ) is uniformly distributed on [0, 1]. (Please solve it with clear explanations so that I can learn it. I will give thumbs up.)
Let X and Y be two independent random variables, and g : R2 --> R an...
Let X and Y be two independent random variables, and g : R^2 → R an arbitrary bivariate function. 1) Suppose that X and Y are continuous with densities fX and fY. Prove that for any y ∈ R with fY(y) > 0, the conditional density of the random variable Z = g(X, Y) given Y = y is the same as the density of the random variable W = g(X, y). 2) Suppose that X and Y...
Let X, Y be independent random variables with X ∼ Uniform([1, 5]) and Y ∼ Uniform([2,...
Let X, Y be independent random variables with X ∼ Uniform([1, 5]) and Y ∼ Uniform([2, 4]). a) Find P(X < Y). b) Find P(X < Y | Y > 3). c) Find P(√Y < X < Y).
Let X and Y be two discrete random variables with a common mgf of e^(4t). After...
Let X and Y be two discrete random variables with a common mgf of e^(4t). After some analysis you conclude, X = 4 and Y = 6. Is this a valid claim? Why or why not?
Let X and Y be two independent random variables such that X + Y has the...
Let X and Y be two independent random variables such that X + Y has the same density as X. What is Y?
Let X and Y be uniform random variables on [0, 1]. If X and Y are...
Let X and Y be uniform random variables on [0, 1]. If X and Y are independent, find the probability distribution function of X + Y