Question



2. Let X ~ Geometric(p) where 0 < p < 1

a. Show explicitly that this family is “very regular,” that is, that R0,R1,R2,R3,R4 hold.

R0 - different parameter values have different functions.

R1 - the parameter space does not contain its own endpoints.

R2 - the set of points x where f(x, p) is not zero does not depend on p.

R3 - one derivative can be found with respect to p.

R4 - two derivatives can be found with respect to p.

b. Find the maximum likelihood estimator of p for this problem; call it Yn.

c. Is Yn unbiased? Explain.

d. Show that Yn is consistent asymptotically normal and identify the asymptotic normal variance.

e. Variance-stabilize your result in (d) or show there is no need to do so.

f. Compute I (p) where I is Fisher’s Information.

g. Compute the efficiency of Yn for p (or show that you should not!).

Solutions

Expert Solution

It is given that X ~ Geometric(p) where 0 < p < 1.

The definition of the geometric distribution states:

"A random variable X is said to have a geometric distribution if it assumes only non-negative integer values and its probability mass function is given by

P(X = x) = q^x p ;  x = 0, 1, 2, 3, ... ;  0 < p < 1 ;  q = 1 - p."
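Purely as a numerical sanity check (a minimal Python sketch, not part of the original solution; the value p = 0.3 and the truncation point are arbitrary choices), the pmf above sums to 1 and has mean q/p:

```python
# Check that the geometric pmf P(X = x) = (1 - p)^x * p, x = 0, 1, 2, ...
# sums to 1 and has mean q/p. p = 0.3 and the truncation are illustration choices.
p = 0.3
q = 1 - p

support = range(2000)                      # truncating the infinite support is enough here
total = sum(q**x * p for x in support)     # should be ~1
mean = sum(x * q**x * p for x in support)  # should be ~q/p

print(f"sum of pmf = {total:.6f} (expected 1)")
print(f"mean       = {mean:.6f} (expected q/p = {q / p:.6f})")
```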

R0 - from this probability mass function, different values of p give different functions; in particular P(X = 0) = p, so two distinct values of p always produce two distinct pmfs.

R1 - the parameter space 0 < p < 1 is an open interval, so it does not contain its own endpoints. At those endpoints the pmf degenerates:

for p = 0:  P(X = x) = (1 - 0)^x · 0 = 0 for every x, which is not a valid pmf;

for p = 1:  P(X = x) = (1 - 1)^x · 1 = 0 for every x ≥ 1, so all the mass sits at x = 0.

Hence the endpoints must be excluded, and the parameter space is the open interval (0, 1).

R2 - the set of points x where f(x, p) = q^x p is not zero is {0, 1, 2, ...} for every p in (0, 1), so the support does not depend on p.

R3 - with log f(x, p) = x log(1 - p) + log p, the first derivative with respect to p is

d/dp log f(x, p) = 1/p - x/(1 - p),

which exists for all 0 < p < 1.

R4 - the second derivative with respect to p is

d²/dp² log f(x, p) = -1/p² - x/(1 - p)²,

which also exists for all 0 < p < 1. Hence R0-R4 all hold and the family is "very regular".

b. MLE of p:

f(x) = q^x p, with q = 1 - p.

For a random sample X1, X2, ..., Xn, the likelihood function can be written as

L(p) = ∏ q^(xi) p = p^n (1 - p)^(Σ xi).

By taking log on both sides,

log L(p) = n log p + (Σ xi) log(1 - p).

By differentiating with respect to p,

d/dp log L(p) = n/p - (Σ xi)/(1 - p).

By putting this equal to zero,

n/p = (Σ xi)/(1 - p), so n(1 - p) = p Σ xi, which gives

Yn = n/(n + Σ Xi) = 1/(1 + X̄), where X̄ is the sample mean.

This is the MLE of p.
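Not part of the original solution, but as an illustrative Monte Carlo sketch in Python (the true value p = 0.3, the seed, and the sample sizes are arbitrary choices), Yn = 1/(1 + X̄) can be seen approaching p as n grows, in line with the consistency claim in part (d):

```python
import numpy as np

# Monte Carlo sketch: the MLE Yn = 1/(1 + sample mean) should approach p as n grows.
# p = 0.3, the seed, and the sample sizes are arbitrary illustration choices.
rng = np.random.default_rng(0)
p = 0.3

for n in (10, 100, 1_000, 10_000, 100_000):
    # numpy's geometric sampler counts the trial on which the first success occurs
    # (support 1, 2, ...); subtract 1 to match the failures-before-first-success
    # pmf q^x * p with support 0, 1, 2, ... used above.
    x = rng.geometric(p, size=n) - 1
    y_n = 1.0 / (1.0 + x.mean())
    print(f"n = {n:>6}: Yn = {y_n:.4f} (true p = {p})")
```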

f. By again differentiating the log-likelihood with respect to p,

d²/dp² log L(p) = -n/p² - (Σ xi)/(1 - p)².

By taking the negative expectation of this (using E[X] = (1 - p)/p), Fisher's information for the sample is n I(p), where for a single observation

I(p) = E[-d²/dp² log f(X, p)] = 1/p² + E[X]/(1 - p)² = 1/p² + 1/(p(1 - p)) = 1/(p²(1 - p)).
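Again only as a hedged numerical sketch (not from the original solution; p = 0.3 and the truncation point are arbitrary), the expected negative second derivative can be approximated by a truncated sum over the support and compared with the closed form 1/(p²(1 - p)):

```python
# Numerical check of I(p) = E[-d^2/dp^2 log f(X, p)] = 1/(p^2 * (1 - p)).
# p = 0.3 and truncating the support at 5000 terms are arbitrary choices.
p = 0.3
q = 1 - p

# -d^2/dp^2 log f(x, p) = 1/p^2 + x/(1 - p)^2, averaged over the pmf q^x * p
info_by_sum = sum((1 / p**2 + x / q**2) * q**x * p for x in range(5000))
info_closed_form = 1 / (p**2 * q)

print(f"truncated-sum I(p) = {info_by_sum:.6f}")
print(f"closed-form   I(p) = {info_closed_form:.6f}")
```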

