Question

In: Statistics and Probability

Let Y = (Y1, Y2,..., Yn) and let θ > 0. Let Y1, ..., Yn | θ be iid Pois(θ). Derive...

Let Y = (Y1, Y2,..., Yn) and let θ > 0. Let Y1, ..., Yn | θ be iid Pois(θ). Derive the posterior density of θ given Y, assuming the prior distribution of θ is Gamma(a, b) where a > 1. Then find the prior and posterior means and the prior and posterior modes of θ.
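A sketch of the standard conjugate derivation, assuming Gamma(a, b) denotes the shape-rate parametrization (the question does not fix the parametrization):

p(\theta \mid y)
  \propto \Big[ \prod_{i=1}^{n} \frac{\theta^{y_i} e^{-\theta}}{y_i!} \Big]
          \cdot \frac{b^{a}}{\Gamma(a)}\, \theta^{a-1} e^{-b\theta}
  \propto \theta^{\,a + \sum_{i=1}^{n} y_i - 1}\, e^{-(b+n)\theta},
\qquad \text{so} \qquad
\theta \mid Y \sim \mathrm{Gamma}\Big( a + \sum_{i=1}^{n} Y_i,\; b + n \Big).

\text{Prior mean: } \frac{a}{b}, \qquad
\text{Prior mode: } \frac{a-1}{b} \ (\text{finite because } a > 1), \qquad
\text{Posterior mean: } \frac{a + \sum_{i=1}^{n} Y_i}{b + n}, \qquad
\text{Posterior mode: } \frac{a + \sum_{i=1}^{n} Y_i - 1}{b + n}.

If Gamma(a, b) is instead read as shape a and scale b, the same steps go through with the rate b replaced by 1/b.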

Solutions

Expert Solution
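As a quick numerical sanity check of the conjugate result sketched above, a minimal Python example; the hyperparameters a = 2, b = 3, the true rate 1.7, and the simulated data are hypothetical and chosen only for illustration:

import numpy as np
from scipy import stats

# Hypothetical shape-rate prior hyperparameters, sample size, and true rate.
a, b, n, theta_true = 2.0, 3.0, 50, 1.7

rng = np.random.default_rng(0)
y = rng.poisson(theta_true, size=n)     # simulated Y1, ..., Yn

# Conjugate update: theta | Y ~ Gamma(a + sum(Y), b + n).
a_post, b_post = a + y.sum(), b + n
post_mean = a_post / b_post             # closed-form posterior mean
post_mode = (a_post - 1) / b_post       # closed-form posterior mode

# Brute-force comparison: evaluate prior x likelihood on a grid, then normalize.
theta = np.linspace(1e-6, 10, 20001)
log_post = (stats.gamma.logpdf(theta, a, scale=1.0 / b)
            + stats.poisson.logpmf(y[:, None], theta).sum(axis=0))
w = np.exp(log_post - log_post.max())
w /= w.sum()                            # normalized grid weights

print("closed-form mean:", post_mean, "  grid mean:", (theta * w).sum())
print("closed-form mode:", post_mode, "  grid mode:", theta[np.argmax(w)])

The grid mean and mode should agree with the closed-form values to within the grid resolution.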


Related Solutions

Let Y1, ..., Yn be a random sample with the pdf: f(y;θ)= θ(1-y)^(θ-1) with θ>0 and...
Let Y1, ..., Yn be a random sample with the pdf f(y; θ) = θ(1−y)^(θ−1) with θ > 0 and 0 < y < 1. i) Obtain the minimum variance unbiased estimator for θ.
Let Y1 and Y2 have joint pdf f(y1, y2) = 6(1−y2) if 0 ≤ y1 ≤ y2 ≤ 1, and 0 otherwise. a)...
Let Y1 and Y2 have joint pdf f(y1, y2) = 6(1−y2) if 0 ≤ y1 ≤ y2 ≤ 1, and 0 otherwise. a) Are Y1 and Y2 independent? Why? b) Find Cov(Y1, Y2). c) Find V(Y1−Y2). d) Find Var(Y1|Y2=y2).
Let Y1, Y2, . . ., Yn be a random sample from a uniform distribution on...
Let Y1, Y2, . . ., Yn be a random sample from a uniform distribution on the interval (θ - 2, θ). a) Show that Ȳ is a biased estimator of θ. Calculate the bias. b) Calculate MSE( Ȳ). c) Find an unbiased estimator of θ. d) What is the mean square error of your unbiased estimator? e) Is your unbiased estimator a consistent estimator of θ?
Suppose that Y1, Y2, ..., Yn is a random sample from the Uniform[0,2] distribution. Let Y(n) and Y(1)...
Suppose that Y1, Y2, ..., Yn is a random sample from the Uniform[0,2] distribution. Let Y(n) and Y(1) be the order statistics. (a) Find E(Y(1)) (b) Find the density of (Y(n) − 1)^2 (c) Find the density of Y(n) − Y(1)
Let Y1,...,Yn be a sample from the Uniform density on [0,2θ]. Show that θ̂ = max(Y1,...
Let Y1,...,Yn be a sample from the Uniform density on [0,2θ]. Show that θ̂ = max(Y1, . . . , Yn) is a sufficient statistic for θ. Find an MVUE (Minimum Variance Unbiased Estimator) for θ.
5. Let Y1, Y2, ...Yn i.i.d. ∼ f(y; α) = (1/6) α^8 y^3 · e^(−α...
5. Let Y1, Y2, ...Yn i.i.d. ∼ f(y; α) = (1/6) α^8 y^3 · e^(−α^2 y), 0 ≤ y < ∞, 0 < α < ∞. (a) (8 points) Find an expression for the Method of Moments estimator of α, α̃. Show all work. (b) (8 points) Find an expression for the Maximum Likelihood estimator for α, α̂. Show all work.
Let Y1, Y2, ..., Yn be a random sample from an exponential distribution with mean theta....
Let Y1, Y2, ..., Yn be a random sample from an exponential distribution with mean theta. We would like to test H0: theta = 3 against Ha: theta = 5 based on this random sample. (a) Find the form of the most powerful rejection region. (b) Suppose n = 12. Find the MP rejection region of level 0.1. (c) Is the rejection region in (b) the uniformly most powerful rejection region of level 0.1 for testing H0: theta = 3...
Construct a two-sided (1−α)-CI for θ in Uniform[0,θ] based on IID Y1,...,Yn
Construct a two-sided (1−α)-CI for θ in Uniform[0,θ] based on IID Y1,...,Yn
Consider a random sample (X1, Y1), (X2, Y2), . . . , (Xn, Yn) where Y...
Consider a random sample (X1, Y1), (X2, Y2), . . . , (Xn, Yn) where Y | X = x is modeled by Y = β0 + β1x + ε, ε ∼ N(0, σ^2), where β0, β1, and σ^2 are unknown. Let βhat1 denote the MLE of β1. Derive V(βhat1).
1. Let X1, . . . , Xn, Y1, . . . , Yn be...
1. Let X1, . . . , Xn, Y1, . . . , Yn be mutually independent random variables, and Z = (1/n) Σ_{i=1}^{n} XiYi. Suppose for each i ∈ {1, . . . , n}, Xi ∼ Bernoulli(p), Yi ∼ Binomial(n, p). What is Var[Z]? 2. There is a fair coin and a biased coin that flips heads with probability 1/4. You randomly pick one of the coins and flip it until you get a...