Question

In: Statistics and Probability

Let X1, X2, …, Xn be independent, identically distributed random variables with p.d.f. f(x) = θx^(θ−1), 0 ≤ x ≤ 1...

Let X1, X2, …, Xn be independent, identically distributed random variables with p.d.f. f(x) = θx^(θ−1), 0 ≤ x ≤ 1.

c. Show that the maximum likelihood estimator for θ is biased, and find a function of the MLE that is unbiased. (Hint: Show that the random variable −ln(Xi) is exponential, the sum of exponentials is Gamma, and the mean of 1/X for a Gamma with parameters α and β is 1/(β(α − 1)).)

d. Is the estimator you found in part c. a minimum variance unbiased estimator?
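
A minimal simulation sketch (not the site's expert solution) for parts c and d, assuming the MLE obtained from the likelihood θ^n ∏ xi^(θ−1) is θ̂ = n / (−Σ ln Xi); the values of θ, n, and the number of replications below are purely illustrative, and comparing the variance against the Cramér–Rao bound θ²/n is only one piece of evidence for part d:

import numpy as np

rng = np.random.default_rng(0)
theta, n, reps = 2.0, 10, 200_000      # illustrative parameter, sample size, replications

# F(x) = x^theta on [0, 1], so inverse-CDF sampling gives X = U^(1/theta)
x = rng.random((reps, n)) ** (1.0 / theta)

t = -np.log(x).sum(axis=1)             # T = -sum(ln Xi), Gamma(n, rate theta) per the hint
mle = n / t                            # maximum likelihood estimator n / (-sum(ln Xi))
unbiased = (n - 1) / t                 # the MLE rescaled by (n - 1)/n

print("mean of MLE         ≈", mle.mean(),      "  vs  n*theta/(n-1)  =", n * theta / (n - 1))
print("mean of (n-1)/T     ≈", unbiased.mean(), "  vs  theta          =", theta)
print("variance of (n-1)/T ≈", unbiased.var(),  "  vs  CRLB theta^2/n =", theta**2 / n)

The empirical mean of the MLE should sit near nθ/(n − 1) rather than θ, which is the bias the hint's Gamma calculation predicts, while the rescaled estimator averages to θ.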

Solutions

Expert Solution


Related Solutions

Let X1, X2, ..., Xn be independent, identically distributed random variables with p.d.f....
Let X1, X2, ..., Xn be independent, identically distributed random variables with p.d.f. f(x) = θx^(θ−1), 0 ≤ x ≤ 1. c) Show that the maximum likelihood estimator for θ is biased, and find a function of the MLE that is unbiased. (Hint: Show that the random variable −ln(Xi) is exponential, the sum of exponentials is Gamma, and the mean of 1/X for a Gamma with parameters α and β is 1/(β(α − 1)).) d)...
Let X1, X2, . . . be a sequence of independent and identically distributed random variables...
Let X1, X2, . . . be a sequence of independent and identically distributed random variables where the distribution is given by the so-called zero-truncated Poisson distribution with probability mass function P(X = x) = λ^x / (x!(e^λ − 1)), x = 1, 2, 3, ... Let N ∼ Binomial(n, 1 − e^(−λ)) be another random variable that is independent of the Xi's. 1) Show that Y = X1 + X2 + ... + XN has a Poisson distribution with mean nλ.
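
A hedged Monte-Carlo sketch of the claim in this related problem; λ, n, and the replication count are illustrative, and the zero-truncated draws use simple rejection of zeros:

import numpy as np

rng = np.random.default_rng(1)
lam, n, reps = 1.3, 5, 50_000                      # illustrative parameters

def zt_poisson(k):
    # k draws from the zero-truncated Poisson(lam): redraw any zeros until none remain
    out = rng.poisson(lam, k)
    while (mask := out == 0).any():
        out[mask] = rng.poisson(lam, mask.sum())
    return out

N = rng.binomial(n, 1 - np.exp(-lam), reps)        # N ~ Binomial(n, 1 - e^(-lambda))
Y = np.array([zt_poisson(k).sum() for k in N])     # Y = X1 + ... + XN, with an empty sum equal to 0

print("mean(Y) ≈", Y.mean(), "  var(Y) ≈", Y.var(), "  target n*lambda =", n * lam)

Both the empirical mean and variance of Y should land near nλ, consistent with Y being Poisson with mean nλ.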
Let Xi's be independent and identically distributed Poisson random variables for 1 ≤ i ≤ n....
Let Xi's be independent and identically distributed Poisson random variables for 1 ≤ i ≤ n. Derive the distribution for the summation of Xi from 1 to n. (Without using MGF)
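
A small numerical sketch related to this question (illustrative λ and truncation point), checking that convolving two Poisson(λ) pmfs reproduces the Poisson(2λ) pmf, which is the pattern the requested convolution argument formalizes:

import numpy as np
from scipy.stats import poisson

lam, K = 2.0, 60                        # illustrative rate and support cutoff
p = poisson.pmf(np.arange(K), lam)      # pmf of a single Poisson(lam) on 0..K-1

conv = np.convolve(p, p)[:K]            # pmf of X1 + X2 by discrete convolution
target = poisson.pmf(np.arange(K), 2 * lam)

print("max |convolution - Poisson(2*lam) pmf| =", np.abs(conv - target).max())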
Let X and Y be two independent and identically distributed random variables with expected value 1...
Let X and Y be two independent and identically distributed random variables with expected value 1 and variance 2.56. (i) Find a non-trivial upper bound for P(|X + Y − 2| ≥ 1). 5 MARKS (ii) Now suppose that X and Y are independent and identically distributed N(1, 2.56) random variables. What is P(|X + Y − 2| ≥ 1) exactly? Briefly, state your reasoning. 2 MARKS (iii) Why is the upper bound you obtained in Part (i) so different...
Let Y1, Y2, Y3, and Y4 be independent, identically distributed random variables from a population with a...
Let Y1, Y2, Y3, and Y4 be independent, identically distributed random variables from a population with a mean μ and a variance σ². Consider a different estimator of μ: W = Y1 + Y2 + Y3 + Y4. Let Y1, Y2, Y3, and Y4 be independent, identically distributed random variables from a population with a mean μ and a variance σ². Consider a different estimator of μ: W = (1/8)Y1 + (1/3)Y2 + (1/6)Y3 + (3/8)Y4. This is an example of a weighted average of the Yi. Show...
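
A quick exact-arithmetic sketch for the weighted-average estimator in this related problem; the weights come from the statement above, while comparing against Var(Ȳ) = σ²/4 for the plain four-observation mean is an assumption about what the follow-up asks:

from fractions import Fraction as F

w = [F(1, 8), F(1, 3), F(1, 6), F(3, 8)]               # weights from the problem statement

print("sum of weights     =", sum(w))                  # equals 1, so E[W] = mu (unbiased)
print("sum of squared wts =", sum(wi**2 for wi in w))  # Var(W) = sigma^2 times this
print("reference 1/4      =", F(1, 4))                 # Var(Ybar) = sigma^2 / 4 for the plain mean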
Let X1, X2, ..., Xn be independent and identically distributed exponential random variables with parameter λ. a)...
Let X1, X2, ..., Xn be independent and identically distributed exponential random variables with parameter λ. a) Compute P{max(X1, X2, ..., Xn) ≤ x} and find the pdf of Y = max(X1, X2, ..., Xn). b) Compute P{min(X1, X2, ..., Xn) ≤ x} and find the pdf of Z = min(X1, X2, ..., Xn). c) Compute E(Y) and E(Z).
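
A hedged simulation sketch for this related problem, comparing against the standard closed forms P(max ≤ x) = (1 − e^(−λx))^n, min ~ Exponential(nλ), and E(max) = (1/λ)·Σ_{k=1}^{n} 1/k; the values of λ, n, and x are illustrative:

import numpy as np

rng = np.random.default_rng(2)
lam, n, reps = 0.5, 4, 200_000                     # illustrative rate, sample size, replications

x = rng.exponential(1 / lam, size=(reps, n))       # NumPy parameterizes by scale = 1/lambda
y, z = x.max(axis=1), x.min(axis=1)

harmonic = sum(1 / k for k in range(1, n + 1))
print("E[max] ≈", y.mean(), "  vs  (1/lam)*H_n =", harmonic / lam)
print("E[min] ≈", z.mean(), "  vs  1/(n*lam)   =", 1 / (n * lam))
print("P(max <= 3) ≈", (y <= 3).mean(), "  vs  (1 - exp(-3*lam))^n =", (1 - np.exp(-3 * lam)) ** n)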
Let X1 and X2 be independent identically distributed random variables with pmf p(0) = 1/4, p(1)...
Let X1 and X2 be independent identically distributed random variables with pmf p(0) = 1/4, p(1) = 1/2, p(2) = 1/4 (a) What is the probability mass function (pmf) of X1 + X2? (b) What is the probability mass function (pmf) of X(2) = max{X1, X2}? (c) What is the MGF of X1? (d) What is the MGF of X1 + X2? (Note: The formulas we did were for the continuous case, so they don’t directly apply here, but you...
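
A short enumeration sketch for this related pmf question; the pmf values are taken from the statement, and the MGF evaluation point t = 0.3 is purely illustrative:

import math
from itertools import product

pmf = {0: 0.25, 1: 0.5, 2: 0.25}                   # p(0) = 1/4, p(1) = 1/2, p(2) = 1/4

sum_pmf, max_pmf = {}, {}
for a, b in product(pmf, repeat=2):                # enumerate all (X1, X2) pairs
    pr = pmf[a] * pmf[b]                           # independence: joint = product of marginals
    sum_pmf[a + b] = sum_pmf.get(a + b, 0) + pr
    max_pmf[max(a, b)] = max_pmf.get(max(a, b), 0) + pr

print("pmf of X1 + X2:    ", dict(sorted(sum_pmf.items())))
print("pmf of max(X1, X2):", dict(sorted(max_pmf.items())))

t = 0.3                                            # MGF of X1: M(t) = sum_x e^(t*x) p(x)
print("M_X1(0.3) =", sum(math.exp(t * xval) * p for xval, p in pmf.items()))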
A: Suppose two random variables X and Y are independent and identically distributed as standard normal....
A: Suppose two random variables X and Y are independent and identically distributed as standard normal. Specify the joint probability density function f(x, y) of X and Y. Next, suppose two random variables X and Y are independent and identically distributed as Bernoulli with parameter 1/2. Specify the joint probability mass function f(x, y) of X and Y. B: Consider a time series realization X = [10, 15, 23, 20, 19] with a length of five periods. Compute the...
2. Let X1, X2, . . . , Xn be independent, uniformly distributed random variables on...
2. Let X1, X2, . . . , Xn be independent, uniformly distributed random variables on the interval [0, θ]. (a) Find the pdf of X(j), the jth order statistic. (b) Use the result from (a) to find E(X(j)). (c) Use the result from (b) to find E(X(j) − X(j−1)), the mean difference between two successive order statistics. (d) Suppose that n = 10, and X1, . . . , X10 represents the waiting times that the n = 10...
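
A brief simulation sketch for this related order-statistics problem, checking the textbook results E(X(j)) = jθ/(n + 1) and mean spacing θ/(n + 1); θ, n, j, and the replication count below are illustrative:

import numpy as np

rng = np.random.default_rng(3)
theta, n, reps = 5.0, 10, 100_000                  # illustrative values

x = np.sort(rng.uniform(0, theta, size=(reps, n)), axis=1)   # each row is a sorted sample

j = 3                                              # which order statistic to inspect (1-based)
print("E[X_(j)] ≈", x[:, j - 1].mean(), "  vs  j*theta/(n+1) =", j * theta / (n + 1))

spacings = np.diff(x, axis=1)                      # X_(j) - X_(j-1) for j = 2, ..., n
print("mean spacing ≈", spacings.mean(), "  vs  theta/(n+1) =", theta / (n + 1))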