Let X1, ..., Xn be an i.i.d. sample from a geometric distribution with parameter p. Define
U = 1 if X1 = 1, and U = 0 if X1 > 1.
(a) Find a sufficient statistic T for p.
(b) Find E(U | T).
Solution
Note: U(x) as defined here differs from its usual definition, although in general it is as defined above. In the general case one obtains a complete result, but not here, which is why the term (1 − p) remains.
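The geometric problem above can be checked numerically. For the geometric distribution on {1, 2, ...}, T = ΣXi is sufficient, and conditioning U = 1{X1 = 1} on T gives E[U | T = t] = C(t−2, n−2)/C(t−1, n−1), since given T = t the sample is uniform over the C(t−1, n−1) compositions of t into n positive parts. A minimal Monte Carlo sketch, with hypothetical values n = 3, p = 0.4, t = 8:

```python
import random
from math import comb

# Compare the Monte Carlo estimate of E[U | T = t] against the
# Rao-Blackwell closed form C(t-2, n-2) / C(t-1, n-1).
# n, p, t are hypothetical values chosen for the sketch.
random.seed(0)
n, p, t = 3, 0.4, 8

def geometric(p):
    # number of Bernoulli(p) trials up to and including the first success
    k = 1
    while random.random() >= p:
        k += 1
    return k

hits = total = 0
for _ in range(400_000):
    xs = [geometric(p) for _ in range(n)]
    if sum(xs) == t:
        total += 1
        hits += (xs[0] == 1)

mc = hits / total
exact = comb(t - 2, n - 2) / comb(t - 1, n - 1)  # 6/21
print(round(mc, 3), round(exact, 3))
```

The conditional samples (those with ΣXi = t) should put X1 = 1 with relative frequency close to 6/21 ≈ 0.286.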
Let X1, . . . , Xn be i.i.d. from the distribution with parameter η and probability density function f(x; η) = e^(−(x−η)) for x > η, and zero otherwise.
1. Find the MLE of η.
2. Show that X_(1) = min(X1, . . . , Xn) is sufficient and complete for η.
3. Find the UMVUE of η.
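For this shifted exponential, the likelihood is increasing in η up to the sample minimum, so the MLE is X_(1); since X_(1) − η is the minimum of n Exp(1) variables, E[X_(1)] = η + 1/n, and the bias-corrected estimator X_(1) − 1/n is the UMVUE. A simulation sketch with hypothetical values η = 2, n = 5:

```python
import random

# Average the MLE X_(1) and the corrected estimator X_(1) - 1/n over
# many samples; the former should center near eta + 1/n = 2.2 and the
# latter near eta = 2.0. Parameter values are hypothetical.
random.seed(1)
n, eta, reps = 5, 2.0, 200_000
acc_mle = acc_umvue = 0.0
for _ in range(reps):
    xmin = min(eta + random.expovariate(1.0) for _ in range(n))
    acc_mle += xmin
    acc_umvue += xmin - 1.0 / n
print(round(acc_mle / reps, 3), round(acc_umvue / reps, 3))
```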
Let X1, ..., Xn be a random sample from a gamma distribution with shape parameter α and rate β (note that this may be a different gamma parameterization than you are used to). Then
f(x | α, β) = (β^α / Γ(α)) x^(α−1) e^(−βx), where x, α, β > 0.
(a) Derive the equations that yield the maximum likelihood
estimators of α and β. Can they be solved explicitly? Hint: don’t
forget your maximum checks, and it may help to do some internet
searching...
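As a numerical sketch (not the requested analytic derivation): setting the score equations to zero gives β̂ = α̂/x̄ and the profiled equation log(α) − ψ(α) = log(x̄) − mean(log x), where ψ is the digamma function; the digamma term means there is no closed-form solution, so one solves numerically. Below, hypothetical values α = 3, β = 2 are simulated (a Gamma with integer shape is a sum of exponentials), and the equation is solved by bisection, with ψ approximated by differencing lgamma:

```python
import math
import random

# Solve the profiled gamma likelihood equation
#   log(a) - digamma(a) = log(mean_x) - mean_log
# by bisection, then recover beta_hat = alpha_hat / mean_x.
random.seed(2)
alpha_true, beta_true = 3, 2.0   # hypothetical true values
xs = [sum(random.expovariate(beta_true) for _ in range(alpha_true))
      for _ in range(5000)]

n = len(xs)
mean_x = sum(xs) / n
mean_log = sum(math.log(x) for x in xs) / n

def digamma(a, h=1e-5):
    # central difference of log-gamma; adequate accuracy for this sketch
    return (math.lgamma(a + h) - math.lgamma(a - h)) / (2 * h)

target = math.log(mean_x) - mean_log
lo_a, hi_a = 1e-3, 100.0
for _ in range(100):
    mid = 0.5 * (lo_a + hi_a)
    # log(a) - digamma(a) is strictly decreasing in a
    if math.log(mid) - digamma(mid) > target:
        lo_a = mid
    else:
        hi_a = mid
alpha_hat = 0.5 * (lo_a + hi_a)
beta_hat = alpha_hat / mean_x
print(round(alpha_hat, 2), round(beta_hat, 2))
```

With 5000 observations the estimates should land close to the true (3, 2).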
2. Let X1, . . . , Xn be a random sample from the distribution with pdf given by fX(x; β) = β / x^(β+1) for x ≥ 1, and zero otherwise.
(a) Show that T = Σ_{i=1}^n log Xi is a sufficient statistic for β. Hint: use
∏_{i=1}^n (1/xi) = exp(log ∏_{i=1}^n (1/xi)) = exp(−Σ_{i=1}^n log xi).
(b) Find the pdf of Y = log X, where X ∼ fX(x; β).
(c) Find the distribution of T. Hint: identify the distribution of Y and use MGFs.
(d) Find...
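The key fact behind parts (b)-(c) can be verified by simulation: if X has pdf β/x^(β+1) on x ≥ 1 (so 1 − F(x) = x^(−β), and X = U^(−1/β) by inverse transform), then Y = log X is Exponential with rate β, giving E[Y] = 1/β and Var(Y) = 1/β², and hence T = Σ log Xi is Gamma(n, β). A sketch with hypothetical β = 2.5:

```python
import math
import random

# Simulate X via inverse transform and check that Y = log X has the
# Exponential(rate beta) mean 1/beta = 0.4 and variance 1/beta^2 = 0.16.
random.seed(3)
beta, reps = 2.5, 200_000
ys = [math.log(random.random() ** (-1.0 / beta)) for _ in range(reps)]
mean_y = sum(ys) / reps
var_y = sum((y - mean_y) ** 2 for y in ys) / reps
print(round(mean_y, 3), round(var_y, 3))
```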
Let X1, . . . , Xn be a random sample from a uniform
distribution on the interval [a, b]
(i) Find the method-of-moments estimators of a and b.
(ii) Find the MLEs of a and b.
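A quick numerical comparison of the two approaches, with hypothetical a = 1, b = 4: matching E[X] = (a+b)/2 and Var(X) = (b−a)²/12 gives the method-of-moments estimators â = x̄ − √3·s and b̂ = x̄ + √3·s, while the MLEs are the sample minimum and maximum.

```python
import math
import random

# Method-of-moments vs. MLE for Uniform(a, b); a_true, b_true are
# hypothetical values for the sketch.
random.seed(4)
a_true, b_true, n = 1.0, 4.0, 10_000
xs = [random.uniform(a_true, b_true) for _ in range(n)]
m1 = sum(xs) / n
m2 = sum(x * x for x in xs) / n
s = math.sqrt(m2 - m1 * m1)                     # (biased) sample sd
a_mom, b_mom = m1 - math.sqrt(3.0) * s, m1 + math.sqrt(3.0) * s
a_mle, b_mle = min(xs), max(xs)
print(round(a_mom, 2), round(b_mom, 2), round(a_mle, 2), round(b_mle, 2))
```

Both pairs should be near (1, 4); the MLEs sit just inside the true endpoints since min(Xi) ≥ a and max(Xi) ≤ b always.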
Let X1, X2, . . . , Xn be a random sample from the uniform distribution with pdf f(x; θ1, θ2) = 1/(2θ2) for θ1 − θ2 < x < θ1 + θ2, where −∞ < θ1 < ∞ and θ2 > 0, and the pdf is equal to zero elsewhere.
(a) Show that Y1 = min(Xi) and Yn = max(Xi), the joint
sufficient statistics for θ1 and θ2, are
complete.
(b) Find the MVUEs of θ1 and θ2.
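An unbiasedness check for part (b), with hypothetical θ1 = 0.5, θ2 = 2, n = 6. Since E[(Y1 + Yn)/2] = θ1 by symmetry and E[Yn − Y1] = 2θ2(n−1)/(n+1), the MVUEs are (Y1 + Yn)/2 for θ1 and (n+1)(Yn − Y1)/(2(n−1)) for θ2:

```python
import random

# Average the two candidate MVUEs over many samples; they should
# center near the hypothetical true values t1 = 0.5 and t2 = 2.0.
random.seed(5)
t1, t2, n, reps = 0.5, 2.0, 6, 200_000
acc1 = acc2 = 0.0
for _ in range(reps):
    xs = [random.uniform(t1 - t2, t1 + t2) for _ in range(n)]
    y1, yn = min(xs), max(xs)
    acc1 += (y1 + yn) / 2.0
    acc2 += (n + 1) * (yn - y1) / (2.0 * (n - 1))
est1, est2 = acc1 / reps, acc2 / reps
print(round(est1, 3), round(est2, 3))
```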
Let X1, ..., Xn be i.i.d. N(θ, 1), where θ ∈ R is the unknown parameter.
(a) Find an unbiased estimator of θ^2 based on (X̄n)^2.
(b) Calculate its variance and compare it with the Cramér-Rao lower bound.
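A simulation sketch for this problem, with hypothetical θ = 1.5, n = 10: since E[X̄²] = θ² + 1/n, the estimator X̄² − 1/n is unbiased for θ². Writing X̄ = θ + ε with ε ∼ N(0, 1/n) gives Var(X̄² − 1/n) = 4θ²/n + 2/n², which exceeds the Cramér-Rao bound 4θ²/n by 2/n².

```python
import random

# Check unbiasedness and variance of xbar^2 - 1/n for theta^2.
# With theta = 1.5, n = 10: theta^2 = 2.25 and the exact variance is
# 4*theta^2/n + 2/n^2 = 0.92. Parameter values are hypothetical.
random.seed(6)
theta, n, reps = 1.5, 10, 200_000
vals = []
for _ in range(reps):
    xbar = sum(random.gauss(theta, 1.0) for _ in range(n)) / n
    vals.append(xbar * xbar - 1.0 / n)
mean_v = sum(vals) / reps
var_v = sum((v - mean_v) ** 2 for v in vals) / reps
print(round(mean_v, 3), round(var_v, 3))
```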
Suppose X1, X2, ..., Xn is a random sample from a Poisson
distribution with unknown parameter µ.
a. What are the mean and variance of this distribution?
b. Is X1 + 2X6 − X8 an estimator of µ? Is it a good estimator?
Why or why not?
c. Find the moment estimator and MLE of µ.
d. Show the estimators in (c) are unbiased.
e. Find the MSE of the estimators in (c).
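For parts (c)-(e), the moment estimator and the MLE coincide at X̄, with E[X̄] = µ (unbiased) and MSE(X̄) = Var(X̄) = µ/n. A sketch with hypothetical µ = 2, n = 10 (Poisson draws via Knuth's multiplication method, since the Python standard library has no Poisson generator):

```python
import math
import random

# Empirical bias and MSE of xbar for Poisson(mu); expect bias near 0
# and MSE near mu/n = 0.2. Parameter values are hypothetical.
random.seed(7)
mu, n, reps = 2.0, 10, 100_000

def poisson(mu):
    # Knuth's multiplication method (fine for small mu)
    limit = math.exp(-mu)
    k, p = 0, 1.0
    while True:
        p *= random.random()
        if p <= limit:
            return k
        k += 1

bias_acc = sq_err = 0.0
for _ in range(reps):
    xbar = sum(poisson(mu) for _ in range(n)) / n
    bias_acc += xbar - mu
    sq_err += (xbar - mu) ** 2
print(round(bias_acc / reps, 3), round(sq_err / reps, 3))
```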
Given the frequency table below:
X 0...
Let X1, . . . , Xn be i.i.d. samples from Uniform(0, θ). Show that for any α ∈ (0, 1), there is a constant c_{n,α} such that [max(X1, ..., Xn), c_{n,α} max(X1, ..., Xn)] is a 1 − α confidence interval for θ.
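A coverage check for this interval, with hypothetical θ = 3, n = 5, α = 0.1: writing M = max(X1, ..., Xn), we have M ≤ θ always, and θ ≤ cM exactly when M ≥ θ/c, which fails with probability (1/c)^n; choosing c_{n,α} = α^(−1/n) makes the coverage 1 − α.

```python
import random

# Empirical coverage of [M, c*M] with c = alpha^(-1/n); should be
# close to 1 - alpha = 0.9. Parameter values are hypothetical.
random.seed(8)
theta, n, alpha = 3.0, 5, 0.1
c = alpha ** (-1.0 / n)
reps, cover = 100_000, 0
for _ in range(reps):
    m = max(random.uniform(0.0, theta) for _ in range(n))
    cover += (m <= theta <= c * m)
rate = cover / reps
print(round(c, 3), round(rate, 3))
```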
R simulation:
Let X1, . . . , Xn be i.i.d. random variables from a uniform
distribution on [0, 2]. Generate
and plot 10 paths of sample means from n = 1 to n = 40 in one
figure for each case. Give
some comments to empirically check the Law of Large Numbers.
(a) When n is large, (X1 + · · · + Xn)/n converges to E[Xi].
(b) When n is large, (X1^2 + · · · + Xn^2)/n converges to...
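The problem asks for R; the following is a Python analogue of the same simulation with the plotting step omitted (with matplotlib one would plot each path against n). It builds 10 running-mean paths for X ∼ Uniform(0, 2) up to n = 40; by the LLN the paths for X̄ settle near E[X] = 1, and those for the mean of squares settle near E[X²] = ∫₀² x²·(1/2) dx = 4/3.

```python
import random

# 10 paths of running sample means (and running means of squares) for
# Uniform(0, 2); the final values illustrate the Law of Large Numbers.
random.seed(9)
final_means, final_sq_means = [], []
for _ in range(10):
    s = s2 = 0.0
    path, path_sq = [], []
    for k in range(1, 41):
        x = random.uniform(0.0, 2.0)
        s += x
        s2 += x * x
        path.append(s / k)       # running sample mean, part (a)
        path_sq.append(s2 / k)   # running mean of squares, part (b)
    final_means.append(path[-1])
    final_sq_means.append(path_sq[-1])
print([round(v, 2) for v in final_means])
```

At n = 40 the paths still fluctuate visibly (sd ≈ 0.09 for the mean), which is itself a useful empirical comment on the rate of convergence.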
Let X and Y be i.i.d. geometric random variables with parameter
(probability of success) p, 0 < p < 1. (a) (6pts) Find P(X
> Y ). (b) (8pts) Find P(X + Y = n) and P(X = k∣X + Y = n), for
n = 2, 3, ..., and k = 1, 2, ..., n − 1
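A Monte Carlo sketch for this problem, with hypothetical p = 0.3: the closed forms are P(X > Y) = (1−p)/(2−p) (from symmetry and P(X = Y) = p/(2−p)), P(X + Y = n) = (n−1) p² (1−p)^(n−2), and P(X = k | X + Y = n) = 1/(n−1), i.e. uniform over k = 1, ..., n−1.

```python
import random

# Estimate P(X > Y) and P(X + Y = 4) for i.i.d. geometric X, Y on
# {1, 2, ...} and compare with the closed forms. p is hypothetical.
random.seed(10)
p, reps = 0.3, 200_000

def geometric(p):
    k = 1
    while random.random() >= p:
        k += 1
    return k

gt = sum_4 = 0
for _ in range(reps):
    x, y = geometric(p), geometric(p)
    gt += (x > y)
    sum_4 += (x + y == 4)

p_gt, p_sum4 = gt / reps, sum_4 / reps
exact_gt = (1 - p) / (2 - p)                # ≈ 0.412
exact_sum4 = 3 * p**2 * (1 - p)**2          # ≈ 0.132
print(round(p_gt, 3), round(exact_gt, 3))
print(round(p_sum4, 3), round(exact_sum4, 3))
```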