Question

In: Statistics and Probability

Let X and Y be i.i.d. geometric random variables with parameter (probability of success) p, 0...

Let X and Y be i.i.d. geometric random variables with parameter (probability of success) p, 0 < p < 1. (a) (6pts) Find P(X > Y ). (b) (8pts) Find P(X + Y = n) and P(X = k∣X + Y = n), for n = 2, 3, ..., and k = 1, 2, ..., n − 1

Solutions

Expert Solution

We are given that X and Y are i.i.d. with the geometric pmf P(X = k) = P(Y = k) = p q^(k−1) for k = 1, 2, ..., where q = 1 − p. Note also that the tail probability is P(X > k) = q^k.

a) Conditioning on the value of Y and using P(X > j) = q^j, the probability is computed as:

P(X > Y) = Σ_{j=1}^∞ P(Y = j) P(X > j) = Σ_{j=1}^∞ p q^(j−1) · q^j = p q Σ_{j=1}^∞ q^(2(j−1)) = pq / (1 − q^2) = q / (1 + q) = (1 − p) / (2 − p).

This is the required probability here. (As a check: by symmetry P(X > Y) = (1 − P(X = Y))/2, and P(X = Y) = Σ_k p^2 q^(2(k−1)) = p/(2 − p), which gives the same answer.)
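The closed form above can be checked numerically by truncating the double sum. This is a minimal sketch; the helper names `geom_pmf` and `prob_x_greater_y` are chosen here for illustration, not taken from the original solution.

```python
def geom_pmf(k, p):
    """P(X = k) = p * (1 - p)**(k - 1) for k = 1, 2, ..."""
    return p * (1 - p) ** (k - 1)

def prob_x_greater_y(p, n_max=500):
    """Truncated sum for P(X > Y): sum_j P(Y = j) * P(X > j),
    where P(X > j) = (1 - p)**j for a geometric starting at 1."""
    q = 1 - p
    return sum(geom_pmf(j, p) * q ** j for j in range(1, n_max + 1))

# Compare against the closed form (1 - p)/(2 - p) for a few values of p.
for p in (0.1, 0.5, 0.9):
    assert abs(prob_x_greater_y(p) - (1 - p) / (2 - p)) < 1e-9
```

Truncating at 500 terms is more than enough, since the tail terms decay like q^(2j).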

b) For n = 2, 3, ..., summing over the possible values of X, the probability is computed as:

P(X + Y = n) = Σ_{k=1}^{n−1} P(X = k) P(Y = n − k) = Σ_{k=1}^{n−1} p q^(k−1) · p q^(n−k−1) = (n − 1) p^2 q^(n−2).

This is the required probability here; it is the negative binomial pmf for the number of trials needed to obtain 2 successes.

Now the conditional probability here is computed, for k = 1, 2, ..., n − 1, from the definition of conditional probability (Bayes' theorem):

P(X = k | X + Y = n) = P(X = k) P(Y = n − k) / P(X + Y = n) = p^2 q^(n−2) / ((n − 1) p^2 q^(n−2)) = 1 / (n − 1).

This is the required probability here: given X + Y = n, X is uniformly distributed on {1, 2, ..., n − 1}.
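Both results of part (b) can be verified by direct convolution of the two geometric pmfs. This is a sketch for checking purposes; the helper names `geom_pmf` and `conv_pmf` are assumptions of this example.

```python
def geom_pmf(k, p):
    """P(X = k) = p * (1 - p)**(k - 1) for k = 1, 2, ..."""
    return p * (1 - p) ** (k - 1)

def conv_pmf(n, p):
    """P(X + Y = n) by direct convolution over k = 1, ..., n - 1."""
    return sum(geom_pmf(k, p) * geom_pmf(n - k, p) for k in range(1, n))

p = 0.3
q = 1 - p
for n in range(2, 20):
    # Closed form: (n - 1) * p^2 * q^(n - 2)
    assert abs(conv_pmf(n, p) - (n - 1) * p**2 * q**(n - 2)) < 1e-12
    # Conditional pmf is uniform on {1, ..., n - 1}
    for k in range(1, n):
        cond = geom_pmf(k, p) * geom_pmf(n - k, p) / conv_pmf(n, p)
        assert abs(cond - 1 / (n - 1)) < 1e-12
```

The uniform conditional distribution reflects that every split (k, n − k) of the n trials contributes the same probability p^2 q^(n−2).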


Related Solutions

Let X and Y be i.i.d. geometric random variables with parameter (probability of success) p, 0<p<1....
Let X and Y be i.i.d. geometric random variables with parameter (probability of success) p, 0<p<1. (a) Find P(X>Y). (b) Find P(X+Y=n) and P(X=k|X+Y=n), for n=2,3,..., and k=1,2,...,n−1. I need an answer asap please. Thank you.
Let X and Y be independent geometric random variables with parameter p. Find the pmf of...
Let X and Y be independent geometric random variables with parameter p. Find the pmf of X + Y
Let X and Y be random variables. Suppose P(X = 0, Y = 0) = .1,...
Let X and Y be random variables. Suppose P(X = 0, Y = 0) = .1, P(X = 1, Y = 0) = .3, P(X = 2, Y = 0) = .2, P(X = 0, Y = 1) = .2, P(X = 1, Y = 1) = .2, P(X = 2, Y = 1) = 0. a. Determine E(X) and E(Y). b. Find Cov(X, Y). c. Find Cov(2X + 3Y, Y).
Let X1,..., Xn be an i.i.d. sample from a geometric distribution with parameter p. U =...
Let X1, ..., Xn be an i.i.d. sample from a geometric distribution with parameter p. Define U = 1 if X1 = 1 and U = 0 if X1 > 1. Find a sufficient statistic T for p, and find E(U | T).
9.8 Let X and Y be independent random variables with probability distributions given by P(X =...
9.8 Let X and Y be independent random variables with probability distributions given by P(X = 0) = P(X = 1) = 1/2 and P(Y = 0) = P(Y = 2) = 1/2. a. Compute the distribution of Z = X + Y. b. Let Ỹ and Z̃ be independent random variables, where Ỹ has the same distribution as Y, and Z̃ the same distribution as Z. Compute the distribution of X̃ = Z̃ − Ỹ.
Let X and Y be uniform random variables on [0, 1]. If X and Y are...
Let X and Y be uniform random variables on [0, 1]. If X and Y are independent, find the probability distribution function of X + Y
Let X~Geometric(p), with parameter p unknown, 0<p<1. a) Find I(p), the Fisher Information in X about...
Let X ~ Geometric(p), with parameter p unknown, 0 < p < 1. a) Find I(p), the Fisher information in X about p. b) Suppose that p̂ is some unbiased estimator of p. Determine the Cramér-Rao lower bound for Var_p[p̂] based on one observation from this distribution. c) Show that p̂ = I{X = 1} is an unbiased estimator of p. Does its variance achieve the Cramér-Rao lower bound?
Let X be distributed as a geometric with a probability of success of 0.10. Find the...
Let X be distributed as a geometric with a probability of success of 0.10. Find the probability it takes 10 or more trials to get the first success.
This is the probability distribution between two random variables X and Y: Y \ X 0...
This is the joint probability distribution of two random variables X and Y:

Y \ X   0     1     2
3       0.1   0.2   0.2
4       0.2   0.2   0.1

a) Are those variables independent? b) What is the marginal probability of X? c) Find E[XY]
Let X and Y be random variables with the joint probability density function fX,Y...
Let X and Y be random variables with the joint probability density function fX,Y(x, y) = 1 for 0 < x, y < 1, and 0 otherwise. a. Let W = max(X, Y). Compute the probability density function of W. b. Let U = min(X, Y). Compute the probability density function of U. c. Compute the probability density function of X + Y.