Question

In: Statistics and Probability


Let Y1,...,Yn have common pmf, p(y)= 2p if y = -1, 1/2 - p if y = 0, 1/2 - p if y = 1.
a.) Find a sufficient statistic for p.
b.) Find the MOM estimator of p. Is it unbiased?
c.) Find the MLE of p. Is it unbiased?
d.) Find the MSE of both estimators. Using MSE as a criterion, which estimator is better?

Expert Solution
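A sketch of one standard approach, with a Monte Carlo check (the true p, the sample size, and the replication count below are illustrative choices, not part of the problem). Since E[Y] = (-1)(2p) + (1)(1/2 - p) = 1/2 - 3p, the MOM estimator is p̃ = (1/2 - Ȳ)/3. The likelihood factors through N = #{i : Yi = -1}, so N is sufficient, and maximizing (2p)^N (1/2 - p)^(n-N) gives the MLE p̂ = N/(2n). Both turn out to be unbiased, so their MSEs are just their variances, which the simulation estimates:

```python
import numpy as np

rng = np.random.default_rng(0)       # illustrative seed
p_true, n, reps = 0.1, 50, 20000     # hypothetical choices, not from the problem

# pmf: P(Y = -1) = 2p, P(Y = 0) = 1/2 - p, P(Y = 1) = 1/2 - p  (needs 0 <= p <= 1/2)
probs = [2 * p_true, 0.5 - p_true, 0.5 - p_true]
samples = rng.choice([-1, 0, 1], size=(reps, n), p=probs)

# MOM: E[Y] = 1/2 - 3p  =>  p_mom = (1/2 - Ybar) / 3
p_mom = (0.5 - samples.mean(axis=1)) / 3

# MLE: N = #{i : Yi = -1} is sufficient, N ~ Binomial(n, 2p)  =>  p_mle = N / (2n)
p_mle = (samples == -1).sum(axis=1) / (2 * n)

# Both estimators are unbiased, so MSE = variance
mse_mom = np.mean((p_mom - p_true) ** 2)
mse_mle = np.mean((p_mle - p_true) ** 2)
print(mse_mom, mse_mle)
```

Analytically (assuming the algebra above), MSE_MOM = (1/4 + 4p - 9p^2)/(9n) and MSE_MLE = p(1 - 2p)/(2n); their difference is (1 - 2p)/(36n) >= 0, so by the MSE criterion the MLE is at least as good for every p in [0, 1/2], and the simulation should reflect that.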


Related Solutions

1. Let X1, . . . , Xn, Y1, . . . , Yn be...
1. Let X1, . . . , Xn, Y1, . . . , Yn be mutually independent random variables, and Z = (1/n)(X1Y1 + · · · + XnYn). Suppose for each i ∈ {1, . . . , n}, Xi ∼ Bernoulli(p), Yi ∼ Binomial(n, p). What is Var[Z]? 2. There is a fair coin and a biased coin that flips heads with probability 1/4. You randomly pick one of the coins and flip it until you get a...
Let Y1,Y2,...,Yn be a Bernoulli distributed random sample with P(Yi = 1) = p and P(Yi...
Let Y1, Y2, ..., Yn be a Bernoulli distributed random sample with P(Yi = 1) = p and P(Yi = 0) = 1 − p for all i. (a) Prove that E(Ȳ) = p and V(Ȳ) = p(1 − p)/n for the sample mean Ȳ of Y1, Y2, ..., Yn, and find a sufficient statistic U for p and show it is sufficient for p. (b) Find the MVUE for p and show it is unbiased for p.
Let Y = (Y1, Y2,..., Yn) and let θ > 0. Let Y|θ ∼ Pois(θ). Derive...
Let Y = (Y1, Y2,..., Yn) and let θ > 0. Let Y|θ ∼ Pois(θ). Derive the posterior density of θ given Y assuming the prior distribution of θ is Gamma(a,b) where a > 1. Then find the prior and posterior means and prior and posterior modes of θ.
Let p(x, y) be a joint pmf of X and Y. p(x...
Let p(x, y) be a joint pmf of X and Y:

p(x, y)   y = 0   y = 1   y = 2
x = 0      .12     .10     .08
x = 1      .13     .17     .10
x = 2      .15     .15     .00

(a) Find the marginal pmf's of X and Y. (b) Determine whether X and Y are independent. (c) Find Correlation(X, Y).
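For a finite table like this one, the marginals, the independence check, and the correlation all follow directly from the definitions. A quick computational sketch (the table values are those given above; numpy is used only as a convenience):

```python
import numpy as np

# Joint pmf table: rows are x = 0, 1, 2; columns are y = 0, 1, 2
P = np.array([[0.12, 0.10, 0.08],
              [0.13, 0.17, 0.10],
              [0.15, 0.15, 0.00]])
x = y = np.array([0.0, 1.0, 2.0])

px = P.sum(axis=1)   # marginal pmf of X (row sums)
py = P.sum(axis=0)   # marginal pmf of Y (column sums)

# X and Y are independent iff p(x, y) = pX(x) * pY(y) for every cell;
# here p(2, 2) = 0 while pX(2) * pY(2) > 0, so they cannot be independent.
independent = np.allclose(P, np.outer(px, py))

ex, ey = x @ px, y @ py
exy = x @ P @ y                      # E[XY] = sum over cells of x * y * p(x, y)
cov = exy - ex * ey
vx = (x**2) @ px - ex**2
vy = (y**2) @ py - ey**2
corr = cov / np.sqrt(vx * vy)
print(px, py, independent, corr)
```

With these numbers the marginals are pX = (.30, .40, .30) and pY = (.40, .42, .18), and Cov(X, Y) = .67 − (1.0)(.78) = −.11, giving a mildly negative correlation.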
Suppose Y1, . . . , Yn are independent random variables with common density fY(y) =...
Suppose Y1, . . . , Yn are independent random variables with common density fY(y) = e^(μ−y), y > μ. Derive a 95% confidence interval for μ, using the MLE of μ as the estimator.
Let Y1, ..., Yn be a random sample with the pdf: f(y;θ)= θ(1-y)^(θ-1) with θ>0 and...
Let Y1, ..., Yn be a random sample with the pdf: f(y; θ) = θ(1 − y)^(θ−1) with θ > 0 and 0 < y < 1. i) Obtain the minimum variance unbiased estimator (for θ).
Suppose that Y1 ,Y2 ,...,Yn is a random sample from distribution Uniform[0,2]. Let Y(n) and Y(1)...
Suppose that Y1, Y2, ..., Yn is a random sample from the distribution Uniform[0, 2]. Let Y(n) and Y(1) be the order statistics. (a) Find E(Y(1)). (b) Find the density of (Y(n) − 1)^2. (c) Find the density of Y(n) − Y(1).
5. Let Y1, Y2, ..., Yn i.i.d. ∼ f(y; α) = (1/6) α^8 y^3 · e^(−α^2 y), 0 ≤ y...
5. Let Y1, Y2, ..., Yn i.i.d. ∼ f(y; α) = (1/6) α^8 y^3 · e^(−α^2 y), 0 ≤ y < ∞, 0 < α < ∞. (a) (8 points) Find an expression for the Method of Moments estimator of α, α̃. Show all work. (b) (8 points) Find an expression for the Maximum Likelihood estimator for α, α̂. Show all work.
Consider the family of distributions with pmf pX(x) = p if x = −1, 2p if...
Consider the family of distributions with pmf pX(x) = p if x = −1, 2p if x = 0, 1 − 3p if x = 1. Here p is an unknown parameter, and 0 ≤ p ≤ 1/3. Let X1, X2, ..., Xn be iid with common pmf a member of this family. (i) Find the MOM estimator of p. (ii) Find the MLE estimator of p. (Consider the statistic A = the number of i...
Let Y1, ..., Yn be a sample from the Uniform density on [0, 2θ]. Show that θ̂ = max(Y1,...
Let Y1, ..., Yn be a sample from the Uniform density on [0, 2θ]. Show that θ̂ = max(Y1, . . . , Yn) is a sufficient statistic for θ. Find an MVUE (Minimum Variance Unbiased Estimator) for θ.