Let X1, …, Xn be a sample of iid Exp(θ1, θ2) random variables
with common pdf f(x; θ1, θ2) = (1/θ1) e^(−(x−θ2)/θ1) for x > θ2
and Θ = ℝ+ × ℝ. a) Show that S = (X(1), ∑_{i=1}^n Xi) is jointly
sufficient for (θ1, θ2).
b) Determine the pdf of X(1).
c) Determine E[X(1)].
d) Determine E[X(1)^2].
e) Determine Var[X(1)].
f) Is X(1) an MSE-consistent estimator of θ2?
g) Given...
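A quick simulation sketch relevant to parts (b)–(e), under the hedged assumption that X(1) is again a shifted exponential with location θ2 and scale θ1/n, so that E[X(1)] = θ2 + θ1/n and Var[X(1)] = θ1^2/n^2. The parameter values and replication count below are arbitrary illustrations, not part of the problem.

```python
import numpy as np

rng = np.random.default_rng(0)
theta1, theta2, n, reps = 2.0, 5.0, 10, 200_000   # illustrative values only

# Draw reps samples of size n from the shifted exponential Exp(theta1, theta2)
samples = theta2 + rng.exponential(scale=theta1, size=(reps, n))
x_min = samples.min(axis=1)                       # X_(1) for each replication

print("empirical  E[X(1)]  =", x_min.mean())
print("claimed    E[X(1)]  =", theta2 + theta1 / n)
print("empirical Var[X(1)] =", x_min.var())
print("claimed   Var[X(1)] =", (theta1 / n) ** 2)
```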
Let X1, ..., Xn be iid with pdf f(x; θ) = (1/(x√(2π)))
e^(−(log x − θ)^2/2) I(x > 0) for θ ∈ ℝ.
(a) (15 points) Find the MLE of θ.
(b) (10 points) Suppose we are testing H0: θ = 0 vs Ha:
θ ≠ 0. Provide a formula for the likelihood ratio test statistic
λ(X).
(c) (5 points) Denote the MLE by θ̂. Show that λ(X) can be
written as a decreasing function of |θ̂|...
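A small numerical sketch of parts (a)–(c), under the hedged assumption that log Xi ~ N(θ, 1), so the MLE is the mean of the log-observations and the likelihood ratio collapses to exp(−n θ̂^2/2). The data below are simulated purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
n, theta_true = 50, 0.3                          # illustrative values only
x = np.exp(rng.normal(theta_true, 1.0, size=n))  # log X_i ~ N(theta, 1)

theta_hat = np.mean(np.log(x))                   # candidate MLE: mean of the logs

def loglik(theta):
    # log-likelihood up to additive constants not involving theta
    return -0.5 * np.sum((np.log(x) - theta) ** 2)

lam = np.exp(loglik(0.0) - loglik(theta_hat))    # likelihood ratio L(0) / L(theta_hat)
print("theta_hat             =", theta_hat)
print("lambda(X)             =", lam)
print("exp(-n*theta_hat^2/2) =", np.exp(-n * theta_hat**2 / 2))  # agrees with lam
```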
Let X1, …, Xn be a sample of iid N(0, θ) random
variables with Θ = (0, ∞). Determine
a) the MLE θ̂ of θ.
b) E(θ̂).
c) the asymptotic variance of the MLE of θ.
d) the MLE of SD(Xi) = √θ.
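A simulation sketch for parts (a)–(c), under the hedged assumption that the MLE is θ̂ = (1/n) ∑ Xi^2, that E[θ̂] = θ, and that the asymptotic variance is 2θ^2/n. The values below are arbitrary illustrations.

```python
import numpy as np

rng = np.random.default_rng(2)
theta, n, reps = 3.0, 100, 100_000        # illustrative values only

x = rng.normal(0.0, np.sqrt(theta), size=(reps, n))
theta_hat = np.mean(x**2, axis=1)         # candidate MLE: mean of squares

print("mean of theta_hat:", theta_hat.mean(), " (target:", theta, ")")
print("var  of theta_hat:", theta_hat.var(), " (target 2*theta^2/n:", 2 * theta**2 / n, ")")
print("MLE of sqrt(theta) by invariance, first replication:", np.sqrt(theta_hat[0]))
```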
Let X = (X1, X2, X3, ..., Xn) be an iid sample with pdf
f(x; a, b) = (1/(ab)) (x/a)^((1−b)/b), 0 ≤ x ≤ a, 0 < b < 1.
Find a two-dimensional sufficient statistic for (a, b).
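A minimal sketch of the statistic that a factorization-theorem argument points to, under the hedged assumption that the joint density depends on the data only through the sample maximum and the product of the observations (equivalently, the sum of logs). The data are simulated purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(3)
a, b, n = 2.0, 0.4, 25                    # illustrative values only

# Inverse-CDF sampling: F(x) = (x/a)^(1/b) on [0, a], so X = a * U^b
x = a * rng.uniform(size=n) ** b

# Candidate two-dimensional sufficient statistic suggested by factorization
s = (x.max(), np.sum(np.log(x)))
print("candidate sufficient statistic (X_(n), sum log X_i):", s)
```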
Let X1, X2, . . . , Xn be iid Poisson random variables
with unknown mean µ.
1. Find the maximum likelihood estimator of µ.
2. Determine whether the maximum likelihood estimator is unbiased
for µ.
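A quick numerical check of both parts, under the hedged assumption that the MLE is the sample mean X̄, which is unbiased for µ. The values below are arbitrary illustrations.

```python
import numpy as np

rng = np.random.default_rng(4)
mu, n, reps = 4.0, 30, 100_000            # illustrative values only

x = rng.poisson(mu, size=(reps, n))
mu_hat = x.mean(axis=1)                   # candidate MLE: the sample mean

print("average of mu_hat over replications:", mu_hat.mean(), " (target:", mu, ")")
```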
Let X1, ..., Xn be independent random
variables, and let X = X1 + ... + Xn be their
sum.
1. Suppose that each Xi is geometric with respective
parameter pi. It is known that the mean of X is equal to
μ, where μ > 0. Show that the variance of X is minimized if the
pi's are all equal to n/μ.
2. Suppose that each Xi is Bernoulli with respective
parameter pi. It is known that the mean of X is equal to
μ, where μ >...
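A numerical sanity check of part 1, under the hedged assumption that "geometric with parameter pi" means support {1, 2, ...} with mean 1/pi and variance (1 − pi)/pi^2. It compares Var(X) for the equal choice pi = n/μ against a perturbed choice with the same mean; the values are arbitrary illustrations.

```python
import numpy as np

n, mu = 4, 10.0                                   # illustrative values only

def var_of_sum(p):
    # Independent geometrics on {1, 2, ...}: Var(X_i) = (1 - p_i) / p_i^2
    return np.sum((1 - p) / p**2)

p_equal = np.full(n, n / mu)                      # all p_i = n/mu
# Perturbed means 1/p_i that still sum to mu -> same E[X], different p_i
m = np.array([1.5, 2.5, 2.0, 4.0])                # sums to 10 = mu
p_other = 1 / m

print("mean check:", np.sum(1 / p_equal), np.sum(1 / p_other))   # both equal mu
print("Var with equal p_i  :", var_of_sum(p_equal))
print("Var with unequal p_i:", var_of_sum(p_other))              # strictly larger
```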
Let X1, X2, …, Xn be n iid random variables whose pdf is
continuous and uniform over the interval [θ − 1, θ + 3].
(1) Determine the method-of-moments estimator of θ.
(2) Is this estimator unbiased? What is its variance?
(3) Find the maximum likelihood estimator (MLE) for this
setting. Is it unique?
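A simulation sketch for parts (1)–(2), under the hedged assumption that the moment estimator is X̄ − 1 (since E[Xi] = θ + 1) with variance 4/(3n). The parameter values below are arbitrary illustrations.

```python
import numpy as np

rng = np.random.default_rng(5)
theta, n, reps = 2.0, 40, 100_000                 # illustrative values only

x = rng.uniform(theta - 1, theta + 3, size=(reps, n))
theta_mom = x.mean(axis=1) - 1                    # candidate moment estimator

print("mean of estimator:", theta_mom.mean(), " (target:", theta, ")")
print("var  of estimator:", theta_mom.var(), " (target 4/(3n):", 4 / (3 * n), ")")
```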
Let X1, X2, · · · , Xn be iid samples from the density
f(x) = θx^(θ−1) if 0 ≤ x ≤ 1, and 0 otherwise.
Find the maximum likelihood estimate for θ. Explain, using the
Strong Law of Large Numbers, why this maximum likelihood estimate
is consistent.
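A numerical illustration of the consistency argument, under the hedged assumption that the MLE is θ̂ = −n / ∑ log Xi and that (1/n) ∑ log Xi → E[log X1] = −1/θ by the SLLN. The parameter value below is an arbitrary illustration.

```python
import numpy as np

rng = np.random.default_rng(6)
theta = 2.5                                       # illustrative value only

for n in (10, 100, 10_000, 1_000_000):
    # Inverse-CDF sampling: F(x) = x^theta on [0, 1], so X = U^(1/theta)
    x = rng.uniform(size=n) ** (1 / theta)
    theta_hat = -n / np.sum(np.log(x))            # candidate MLE
    print(f"n = {n:>9}: theta_hat = {theta_hat:.4f}  (target {theta})")
```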
Let X1, ..., Xn be i.i.d. random variables with the density
function f(x|θ) = e^(θ−x), θ ≤ x.
a. Find the Method of Moments estimate of θ.
b. Find the MLE of θ. (Hint: Think carefully before taking the
derivative; do we have to take a derivative?)
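A quick sketch contrasting the two estimators in parts (a)–(b), under the hedged assumption that E[X] = θ + 1 (so the moment estimate is X̄ − 1) and that the likelihood is maximized at the sample minimum. The values below are arbitrary illustrations.

```python
import numpy as np

rng = np.random.default_rng(7)
theta, n = 1.0, 200                               # illustrative values only

x = theta + rng.exponential(scale=1.0, size=n)    # density e^(theta - x), x >= theta

theta_mom = x.mean() - 1                          # candidate method-of-moments estimate
theta_mle = x.min()                               # candidate MLE: the sample minimum

print("method of moments:", theta_mom)
print("MLE (sample min) :", theta_mle, " (true theta:", theta, ")")
```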