Question

In: Statistics and Probability


5. Let Y1, Y2, ..., Yn be i.i.d. with density f(y; α) = (1/6) α^8 y^3 e^(−α^2 y), 0 ≤ y < ∞, 0 < α < ∞.

(a) (8 points) Find an expression for the Method of Moments estimator of α, α̃. Show all work.

(b) (8 points) Find an expression for the Maximum Likelihood estimator of α, α̂. Show all work.
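
A sketch of one way to work parts (a) and (b), assuming the density above is read as f(y; α) = (1/6) α^8 y^3 e^(−α^2 y), i.e. Y ~ Gamma(shape 4, rate α^2), which integrates to 1 since Γ(4) = 3! = 6:

(a) Method of Moments. For this Gamma distribution, E[Y] = 4/α^2. Equating the first population moment to the sample mean Ȳ = (1/n) Σ Yi gives 4/α̃^2 = Ȳ, so

    α̃ = 2 / √Ȳ.

(b) Maximum Likelihood. The likelihood and log-likelihood are

    L(α) = Π_{i=1}^n (1/6) α^8 Yi^3 e^(−α^2 Yi) = 6^(−n) α^(8n) (Π Yi^3) e^(−α^2 Σ Yi),
    ℓ(α) = −n ln 6 + 8n ln α + 3 Σ ln Yi − α^2 Σ Yi.

Setting ℓ'(α) = 8n/α − 2α Σ Yi = 0 gives α^2 = 4n / Σ Yi, and ℓ''(α) = −8n/α^2 − 2 Σ Yi < 0 confirms a maximum, so

    α̂ = 2 / √Ȳ,

which here coincides with the Method of Moments estimator.

A quick numerical sanity check (not part of the original problem; it assumes the Gamma(4, α^2) reading above and uses numpy's shape/scale parameterization, where scale = 1/rate):

    import numpy as np

    rng = np.random.default_rng(0)
    alpha_true = 1.5
    n = 100_000
    # simulate Y ~ Gamma(shape=4, rate=alpha_true^2) via scale = 1/rate
    y = rng.gamma(shape=4.0, scale=1.0 / alpha_true**2, size=n)
    alpha_hat = 2.0 / np.sqrt(y.mean())
    print(alpha_true, alpha_hat)  # alpha_hat should be close to alpha_true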

Solutions

Expert Solution


Related Solutions

Let Y1 and Y2 have joint pdf f(y1, y2) = 6(1 − y2) if 0 ≤ y1 ≤ y2 ≤ 1, and 0 otherwise. a) Are Y1 and Y2 independent? Why? b) Find Cov(Y1, Y2). c) Find Var(Y1 − Y2). d) Find Var(Y1 | Y2 = y2).
Let Y = (Y1, Y2,..., Yn) and let θ > 0. Let Y|θ ∼ Pois(θ). Derive the posterior density of θ given Y assuming the prior distribution of θ is Gamma(a,b) where a > 1. Then find the prior and posterior means and prior and posterior modes of θ.
Assume Y1, Y2, . . . , Yn are i.i.d. Normal(μ, σ^2), where σ^2 is known and fixed and the unknown mean μ has a Normal(0, σ^2/m) prior, where m is a given constant. Give a 95% credible interval for μ.
If you conduct an experiment 1500 times independently, i = 1, 2, 3, ..., 1500, let y1, y2, ..., yN be i.i.d. observations from this experiment, where yi = 1 (heads) with probability β and yi = 0 (tails) with probability 1 − β. If you get 600 heads and 900 tails, what is the MLE of β? A. 0.4 B. 0.5 C. 0.6 D. 0.7
Suppose that Y1, Y2, ..., Yn is a random sample from the Uniform[0, 2] distribution. Let Y(n) and Y(1) be the order statistics. (a) Find E(Y(1)). (b) Find the density of (Y(n) − 1)^2. (c) Find the density of Y(n) − Y(1).
Let Y1, Y2, . . ., Yn be a random sample from a uniform distribution on the interval (θ - 2, θ). a) Show that Ȳ is a biased estimator of θ. Calculate the bias. b) Calculate MSE( Ȳ). c) Find an unbiased estimator of θ. d) What is the mean square error of your unbiased estimator? e) Is your unbiased estimator a consistent estimator of θ?
The joint density of Y1, Y2 is given by f(y1, y2) = k for −1 ≤ y1 ≤ 1, 0 ≤ y2 ≤ 1, y1 + y2 ≤ 1, y1 − y2 ≥ −1, and 0 otherwise. a. Find the value of k that makes this a probability density function. b. Find the probabilities P(Y2 ≤ 1/2) and P(Y1 ≥ −1/2, Y2 ≤ 1/2). c. Find the marginal distributions of Y1 and of Y2. d. Determine if Y1 and Y2 are independent e....
Let Y1, ..., Yn be a random sample with pdf f(y; θ) = θ(1 − y)^(θ−1), where θ > 0 and 0 < y < 1. i) Obtain the minimum variance unbiased estimator for θ.
Let Y1, Y2, ..., Yn be a random sample from an exponential distribution with mean theta. We would like to test H0: theta = 3 against Ha: theta = 5 based on this random sample. (a) Find the form of the most powerful rejection region. (b) Suppose n = 12. Find the MP rejection region of level 0.1. (c) Is the rejection region in (b) the uniformly most powerful rejection region of level 0.1 for testing H0: theta = 3...
1. Let X1, . . . , Xn, Y1, . . . , Yn be mutually independent random variables, and Z = (1/n) Σ_{i=1}^n Xi Yi. Suppose for each i ∈ {1, . . . , n}, Xi ∼ Bernoulli(p), Yi ∼ Binomial(n, p). What is Var[Z]? 2. There is a fair coin and a biased coin that flips heads with probability 1/4. You randomly pick one of the coins and flip it until you get a...