Question

In: Math


Consider two random variables X and Y, with Y = (a+bX)

- Find E(Y)

- Find Cov(X,Y)

- Find Corr(X,Y)

Solutions

Expert Solution

Let the expectation and the variance of X be μ and σ², that is, E(X) = μ and Var(X) = σ².

Now,

E(Y) = E(a + bX) = E(a) + E(bX) [since E(u+v) = E(u) + E(v)]

= a + bE(X) = a + bμ [since the expectation of a constant is the constant itself and E(bX) = bE(X) when b is a constant]

Now,

Cov(X,Y) = Cov(X, a + bX) = Cov(X, a) + b·Cov(X, X) = 0 + b·Var(X) = bσ² [since a constant has zero covariance with any random variable]

Now,

Var(Y) = Var(a + bX) = b²·Var(X) = b²σ², so SD(Y) = |b|σ.

Now,

Corr(X,Y) = Cov(X,Y) / (SD(X)·SD(Y)) = bσ² / (σ · |b|σ) = b/|b|,

which equals +1 when b > 0 and −1 when b < 0 (it is undefined when b = 0, since Y is then a constant).

Related Solutions

Let X and Y be continuous random variables with E[X] = E[Y] = 4 and var(X)...
Let X and Y be continuous random variables with E[X] = E[Y] = 4 and var(X) = var(Y) = 10. A new random variable is defined as: W = X+2Y+2. a. Find E[W] and var[W] if X and Y are independent. b. Find E[W] and var[W] if E[XY] = 20. c. If we find that E[XY] = E[X]E[Y], what do we know about the relationship between the random variables X and Y?
Let X and Y be jointly normal random variables with parameters E(X) = E(Y) =...
Let X and Y be jointly normal random variables with parameters E(X) = E(Y) = 0, Var(X) = Var(Y) = 1, and Cor(X, Y) = ρ. For some nonnegative constants a ≥ 0 and b ≥ 0 with a^2 + b^2 > 0, define W1 = aX + bY and W2 = bX + aY. (a) Show that Cor(W1, W2) ≥ ρ. (b) Assume that ρ = 0.1. Are W1 and W2 independent or not? Why? (c) Assume now...
Question 1 The random variables X & Y are independent. E[X] = 5, E[Y] = 7,...
Question 1 The random variables X & Y are independent. E[X] = 5, E[Y] = 7, Var[X] = 4, Var[Y] = 6. Calculate the following expectations and variances (notation: E[X] = μ, Var[X] = σ²). (a) E[5X+6] (b) E[6Y+3] (c) E[X-Y+11] (d) Var[3X+10000] (e) Var[-4Y-1234567] (f) Var[2X-3Y]
Question 1: Two jointly distributed discrete random variables Consider two discrete random variables X, Y ,...
Question 1: Two jointly distributed discrete random variables Consider two discrete random variables X, Y, taking values in {0, 1, 2, 3} each (for a total of 16 possible points). Their joint probability mass function is given by fXY(x, y) = c · (x+1)/(y+1). Answer the following questions. (a) What is c? (b) What is the probability that X = Y? (c) Derive the marginal probability mass functions for both X and Y. (d)...
Define a joint distribution for two random variables (X,Y) such that (i) Cov(X,Y)=0 and (ii) E[Y...
Define a joint distribution for two random variables (X,Y) such that (i) Cov(X,Y) = 0 and (ii) E[Y | X] is not equal to E[Y]. How do I define a joint distribution that satisfies both (i) and (ii) above at the same time? Please give me an example and an explanation of how it meets the two conditions.
Consider the two dependent discrete random variables X and Y . The variable X takes values...
Consider the two dependent discrete random variables X and Y. The variable X takes values in {−1, 1} while Y takes values in {1, 2, 3}. We observe that P(Y = 1 | X = −1) = 1/6, P(Y = 2 | X = −1) = 1/2, P(Y = 1 | X = 1) = 1/2, P(Y = 2 | X = 1) = 1/4, and P(X = 1) = 2/5. (a) Find the marginal probability mass function (pmf) of Y. (b) Sketch the cumulative distribution function (cdf) of Y. (c) Compute the expected value E(Y) of Y. (d) Compute the conditional expectation...
Let X and Y be two discrete random variables with a common mgf of e^(4t). After...
Let X and Y be two discrete random variables with a common mgf of e^(4t). After some analysis you conclude that X = 4 and Y = 6. Is this a valid claim? Why or why not?
In this problem there are two random variables X and Y. The random variable Y counts...
In this problem there are two random variables X and Y. The random variable Y counts how many times we roll the die in the following experiment: First, we flip a fair coin. If it comes up Heads, we set X = 1 and roll a fair die until we get a six. If it comes up Tails, we set X = 0 and roll the die until we get an even number (2, 4 or 6). a) What are the possible values taken...
Let X and Y be random variables with finite means. Show that min_g E[(Y − g(X))^2] = E[(Y − E(Y|X))^2] Hint:...
Let X and Y be random variables with finite means. Show that min_g E[(Y − g(X))^2] = E[(Y − E(Y|X))^2], where the minimum is taken over functions g(x). Hint: a − b = a − c + c − b.