Question

In: Statistics and Probability

Let X and Y be i.i.d. geometric random variables with parameter (probability of success) p, 0<p<1.

(a) Find P(X>Y).

(b) Find P(X+Y=n) and P(X=k|X+Y=n), for n=2,3,..., and k=1,2,...,n−1.

I need an answer asap please. Thank you.

Solutions

Expert Solution

a)

P(X > Y) + P(X < Y) + P(X = Y) = 1

Since X and Y are i.i.d. with the same parameter, by symmetry

P(X > Y) = P(X < Y)

hence

2 P(X > Y) = 1 - P(X = Y)

Now, summing the joint pmf along the diagonal and using the geometric series,

P(X = Y) = sum over k = 1, 2, ... of P(X = k) P(Y = k)

= sum over k = 1, 2, ... of p^2 (1-p)^(2(k-1))

= p^2 / (1 - (1-p)^2)

= p/(2-p)

hence

P(X > Y) = 1/2 * (1 - p/(2-p))

= 1/2 * (2-2p)/(2-p)

= (1-p)/(2-p)
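As a quick sanity check (a sketch, not part of the derivation), the closed form (1-p)/(2-p) can be compared against a Monte Carlo estimate. The helper `geometric` below is our own sampler, counting Bernoulli(p) trials up to and including the first success:

```python
import random

def geometric(p, rng):
    """Number of Bernoulli(p) trials up to and including the first success."""
    k = 1
    while rng.random() >= p:
        k += 1
    return k

rng = random.Random(0)
p = 0.3
trials = 200_000

# Estimate P(X > Y) by simulating i.i.d. pairs (X, Y)
hits = sum(geometric(p, rng) > geometric(p, rng) for _ in range(trials))
estimate = hits / trials

exact = (1 - p) / (2 - p)  # (1-p)/(2-p) = 0.7/1.7 for p = 0.3
print(estimate, exact)
```

With 200,000 trials the estimate should agree with the exact value to about two decimal places.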

b)

First, P(X + Y = n). The event {X + Y = n} is the disjoint union, over k = 1, ..., n-1, of the events {X = k, Y = n-k}, so by independence

P(X + Y = n) = sum over k = 1 to n-1 of P(X = k) P(Y = n-k)

= sum over k = 1 to n-1 of p(1-p)^(k-1) * p(1-p)^(n-k-1)

= (n-1) p^2 (1-p)^(n-2)

since every term in the sum equals p^2 (1-p)^(n-2) and there are n-1 of them. Then

P(X = k | X+Y = n)

= P(X = k, X + Y = n) / P(X + Y = n)

= P(X = k) * P(Y = (n-k)) / P(X + Y = n)

= p(1-p)^(k-1) * p * (1-p)^(n-k-1) / ( (n-1)(1-p)^(n-2) p^2)

= 1/(n-1)

i.e., given X + Y = n, X is uniformly distributed on {1, 2, ..., n-1}.
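The conditional uniformity can also be checked by simulation (a sketch with an assumed helper sampler `geometric`, p = 0.4, and n = 5): among simulated pairs with X + Y = n, each value k = 1, ..., n-1 of X should occur with frequency about 1/(n-1).

```python
import random
from collections import Counter

def geometric(p, rng):
    """Number of Bernoulli(p) trials up to and including the first success."""
    k = 1
    while rng.random() >= p:
        k += 1
    return k

rng = random.Random(1)
p, n = 0.4, 5

# Condition on X + Y = n and tally the observed values of X
counts = Counter()
total = 0
for _ in range(500_000):
    x, y = geometric(p, rng), geometric(p, rng)
    if x + y == n:
        counts[x] += 1
        total += 1

# Each k in {1, ..., n-1} should have frequency near 1/(n-1) = 0.25
for k in range(1, n):
    print(k, counts[k] / total)
```

Here P(X + Y = 5) = 4 p^2 (1-p)^3 ≈ 0.138, so roughly 69,000 of the 500,000 pairs are retained, which is plenty to see the uniform pattern.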

