Question

In: Statistics and Probability

Let X and Y be two independent random variables, and g : R^2 → R an...

Let X and Y be two independent random variables, and g : R^2 → R an arbitrary bivariate function.

1) Suppose that X and Y are continuous with densities fX and fY. Prove that for any y ∈ R with fY(y) > 0, the conditional density of the random variable Z = g(X, Y) given Y = y is the same as the density of the random variable W = g(X, y).

2) Suppose that X and Y are discrete with probability mass functions pX and pY. Prove that for any y ∈ R with pY(y) > 0, the conditional probability mass function of the random variable Z = g(X, Y) given Y = y is the same as the probability mass function of the random variable W = g(X, y).

Thanks a lot!
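In outline, one standard way to argue both parts (a sketch only, assuming the usual definitions of conditional density and conditional PMF, and using the fact that independence makes the conditional law of X given Y = y equal to the law of X):

```latex
% Sketch only: assumes f_{X|Y}(x|y) := f_{X,Y}(x,y)/f_Y(y) and, by independence,
% f_{X,Y}(x,y) = f_X(x) f_Y(y), so f_{X|Y}(x|y) = f_X(x).
% Part 1 (continuous): compare conditional CDFs, then differentiate in z.
\[
F_{Z\mid Y}(z\mid y)
  = P\bigl(g(X,Y)\le z \mid Y=y\bigr)
  = \int_{\{x:\,g(x,y)\le z\}} f_{X\mid Y}(x\mid y)\,dx
  = \int_{\{x:\,g(x,y)\le z\}} f_X(x)\,dx
  = P\bigl(g(X,y)\le z\bigr)
  = F_W(z),
\]
\[
f_{Z\mid Y}(z\mid y)
  = \frac{\partial}{\partial z}F_{Z\mid Y}(z\mid y)
  = \frac{d}{dz}F_W(z)
  = f_W(z).
\]
% Part 2 (discrete): g(X,y) is a function of X alone, hence independent of Y.
\[
p_{Z\mid Y}(z\mid y)
  = \frac{P\bigl(g(X,Y)=z,\;Y=y\bigr)}{P(Y=y)}
  = \frac{P\bigl(g(X,y)=z\bigr)\,P(Y=y)}{P(Y=y)}
  = P\bigl(g(X,y)=z\bigr)
  = p_W(z).
\]
```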

Solutions

Expert Solution


Related Solutions

Let X and Y be two independent random variables such that X + Y has the...
Let X and Y be two independent random variables such that X + Y has the same density as X. What is Y?
Let X and Y be independent Gaussian(0,1) random variables. Define the random variables R and Θ,...
Let X and Y be independent Gaussian(0,1) random variables. Define the random variables R and Θ by R^2 = X^2 + Y^2, Θ = tan^(-1)(Y/X). You can think of X and Y as the real and the imaginary part of a signal. Similarly, R^2 is its power, Θ is the phase, and R is the magnitude of that signal. (b) Find the probability density functions of R and Θ. Are R and Θ independent random variables?
Let X and Y be two independent random variables. X is a binomial (25,0.4) and Y...
Let X and Y be two independent random variables. X is a binomial (25, 0.4) and Y is a uniform (0, 6). Let W = 2X − Y and Z = 2X + Y. a) Find the expected value of X, the expected value of Y, the variance of X, and the variance of Y. b) Find the expected value of W. c) Find the variance of W. d) Find the covariance of Z and W.
Let X, Y be independent exponential random variables with mean one. Show that X/(X + Y...
Let X, Y be independent exponential random variables with mean one. Show that X/(X + Y ) is uniformly distributed on [0, 1]. (Please solve it with clear explanations so that I can learn it. I will give thumbs up.)
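A hedged sketch of one route (assuming X and Y are independent Exp(1), i.e. mean one): compute the CDF of X/(X + Y) directly.

```latex
% Sketch only. For 0 < u < 1, write c := u/(1-u) >= 0.
\[
P\!\left(\frac{X}{X+Y}\le u\right)
  = P\bigl((1-u)X\le uY\bigr)
  = P\bigl(X\le cY\bigr)
  = \int_0^{\infty}\bigl(1-e^{-cy}\bigr)e^{-y}\,dy
  = 1-\frac{1}{1+c}
  = \frac{c}{1+c}
  = u,
\]
% which is the CDF of Uniform[0,1], so X/(X+Y) is uniformly distributed on [0,1].
```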
Let X and Y be two independent and identically distributed random variables with expected value 1...
Let X and Y be two independent and identically distributed random variables with expected value 1 and variance 2.56. (i) Find a non-trivial upper bound for P(|X + Y − 2| ≥ 1). 5 MARKS (ii) Now suppose that X and Y are independent and identically distributed N(1, 2.56) random variables. What is P(|X + Y − 2| ≥ 1) exactly? Briefly, state your reasoning. 2 MARKS (iii) Why is the upper bound you obtained in Part (i) so different...
Let X and Y be independent positive random variables. Let Z=X/Y. In what follows, all occurrences...
Let X and Y be independent positive random variables. Let Z=X/Y. In what follows, all occurrences of x, y, z are assumed to be positive numbers. Suppose that X and Y are discrete, with known PMFs, pX and pY. Then, pZ|Y(z|y)=pX(?). What is the argument in the place of the question mark?    Suppose that X and Y are continuous, with known PDFs, fX and fY. Provide a formula, analogous to the one in part (a), for fZ|Y(z|y) in terms...
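This is essentially the result from the main question above with g(x, y) = x/y; a sketch of the fill-ins (assuming x, y, z > 0 as stated):

```latex
% Sketch only: given Y = y, Z = X/y, so {Z = z} = {X = yz} in the discrete case,
% and F_{Z|Y}(z|y) = F_X(yz) in the continuous case; differentiate in z.
\[
p_{Z\mid Y}(z\mid y) = p_X(yz),
\qquad
f_{Z\mid Y}(z\mid y) = \frac{d}{dz}F_X(yz) = y\,f_X(yz).
\]
```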
Let X, Y be independent random variables with X ∼ Uniform([1, 5]) and Y ∼ Uniform([2,...
Let X, Y be independent random variables with X ∼ Uniform([1, 5]) and Y ∼ Uniform([2, 4]). a) Find P(X < Y). b) Find P(X < Y | Y > 3). c) Find P(√Y < X < Y).
Let X1, X2, and X3 be independent uniform random variables on [0,1]. Write Y = X1 + X2...
Let X1, X2, and X3 be independent uniform random variables on [0,1]. Write Y = X1 + X2 and Z = X2 + X3. a.) Compute E[X1X2X3]. (5 points) b.) Compute Var(X1). (5 points) c.) Compute and draw a graph of the density function fY. (15 points)
9.8 Let X and Y be independent random variables with probability distributions given by P(X =...
9.8 Let X and Y be independent random variables with probability distributions given by P(X = 0) = P(X = 1) = 1/2 and P(Y = 0) = P(Y = 2) = 1/2. a. Compute the distribution of Z = X + Y. b. Let Y˜ and Z˜ be independent random variables, where Y˜ has the same distribution as Y, and Z˜ the same distribution as Z. Compute the distribution of X˜ = Z˜ − Y˜.
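For part (a), a quick illustrative computation under the stated PMFs: each value of the sum arises from exactly one (x, y) pair, and independence factors the probabilities.

```latex
\[
P(Z=0)=P(X=0)P(Y=0)=\tfrac14,\quad
P(Z=1)=P(X=1)P(Y=0)=\tfrac14,\quad
P(Z=2)=P(X=0)P(Y=2)=\tfrac14,\quad
P(Z=3)=P(X=1)P(Y=2)=\tfrac14.
\]
```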
Let X and Y be independent discrete random variables with the following PDFs: x 0 1...
Let X and Y be independent discrete random variables with the following PDFs: f(x) = P[X=x] given by f(0) = 0.5, f(1) = 0.3, f(2) = 0.2, and g(y) = P[Y=y] given by g(0) = 0.65, g(1) = 0.25, g(2) = 0.1. (a) Show work to find the PDF h(w) = P[W=w] = (f*g)(w) (the convolution) of W = X + Y. (b) Show work to find E[X], E[Y], and E[W] (note that E[W] = E[X] + E[Y]).
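A minimal sketch of part (a) in Python (assuming NumPy is available; the variable names are illustrative, not from the post): the distribution of a sum of independent discrete variables is the convolution of their probability vectors, which np.convolve computes directly.

```python
# Illustrative sketch, not a verified posted solution.
import numpy as np

f = np.array([0.5, 0.3, 0.2])    # P[X = 0], P[X = 1], P[X = 2]
g = np.array([0.65, 0.25, 0.1])  # P[Y = 0], P[Y = 1], P[Y = 2]

h = np.convolve(f, g)            # PMF of W = X + Y on {0, 1, 2, 3, 4}
w = np.arange(len(h))

print("h(w):", dict(zip(w.tolist(), np.round(h, 3).tolist())))
print("E[X] =", float(np.dot([0, 1, 2], f)))
print("E[Y] =", float(np.dot([0, 1, 2], g)))
print("E[W] =", float(np.dot(w, h)))   # equals E[X] + E[Y]
```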