Question

In: Statistics and Probability

3. Let X1...Xn be iid N(μX, σ) and Y1...Yn be iid N(μY, σ), with the two samples X1...Xn and...

3. Let X1...Xn be iid N(μX, σ) and Y1...Yn be iid N(μY, σ), with the two samples X1...Xn and Y1...Yn independent of each other. Assume that the common population SD σ is known but the two means are not. Consider testing the null hypothesis H0: μX = μY against the alternative H1: μX ≠ μY.

a. Find the likelihood ratio test statistic Λ. Specify which MLEs you are using and how you plug them in.
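For reference, here is a rough sketch of how part (a) usually works out (my own outline under the stated assumptions, not the expert solution posted on this page). With σ known, the unrestricted MLEs are the sample means, μ̂X = X̄ and μ̂Y = Ȳ; under H0 the common mean is estimated by (X̄ + Ȳ)/2, since the two samples have the same size n and the same σ. Plugging these into the two normal likelihoods and simplifying gives, in LaTeX notation,

\Lambda = \frac{L(\hat\mu_0, \hat\mu_0)}{L(\hat\mu_X, \hat\mu_Y)}
        = \exp\!\left( -\frac{n(\bar X - \bar Y)^2}{4\sigma^2} \right),
\qquad
-2\log\Lambda = \frac{(\bar X - \bar Y)^2}{2\sigma^2/n} \sim \chi^2_1 \ \text{under } H_0,

so the LRT rejects for large |X̄ − Ȳ| and coincides with the usual two-sample z-test.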


Solutions

Expert Solution


Related Solutions

3. Let X1...Xn be iid N(μX, σ) and Y1...Yn be iid N(μY, σ), with the two samples X1...Xn and...
3. Let X1...Xn be iid N(μX, σ) and Y1...Yn be iid N(μY, σ), with the two samples X1...Xn and Y1...Yn independent of each other. Assume that the common population SD σ is known but the two means are not. Consider testing the null hypothesis H0: μX = μY against the alternative H1: μX ≠ μY. d. Assume σ = 1 and n = 20. How large must δ be for the size 0.01 test to have power at least 0.99? e. Assume σ = 1 and δ = 0.2. How large must n be for...
1. Let X1, . . . , Xn, Y1, . . . , Yn be...
1. Let X1, . . . , Xn, Y1, . . . , Yn be mutually independent random variables, and Z = (1/n) Σ_{i=1}^n XiYi. Suppose for each i ∈ {1, . . . , n}, Xi ∼ Bernoulli(p), Yi ∼ Binomial(n, p). What is Var[Z]? 2. There is a fair coin and a biased coin that flips heads with probability 1/4. You randomly pick one of the coins and flip it until you get a...
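For part 1 of this related problem, a quick sketch (mine, not the linked page's solution): because Xi and Yi are independent, Var(XiYi) = E[Xi²]E[Yi²] − (E[Xi]E[Yi])², and the n products are themselves independent, so

\operatorname{Var}(X_iY_i) = p\,[\,np(1-p) + n^2p^2\,] - n^2p^4 = np^2(1-p)(1+np),
\qquad
\operatorname{Var}(Z) = \frac{1}{n^2}\sum_{i=1}^n \operatorname{Var}(X_iY_i) = p^2(1-p)(1+np).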
Let X1,X2,...,Xn and Y1,Y2,...,Ym be independent random samples drawn from normal distributions with means μX and...
Let X1, X2, ..., Xn and Y1, Y2, ..., Ym be independent random samples drawn from normal distributions with means μX and μY, respectively, and with the same known variance σ². Use the generalized likelihood ratio criterion to derive a test procedure for choosing between H0: μX = μY and H1: μX ≠ μY.
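For this unequal-sample-size version, a rough sketch (not the linked page's worked answer): the restricted MLE of the common mean is the weighted average (nX̄ + mȲ)/(n + m), and the generalized likelihood ratio reduces to the two-sample z-test,

-2\log\Lambda = \frac{(\bar X - \bar Y)^2}{\sigma^2\left(\frac{1}{n} + \frac{1}{m}\right)},
\qquad
\text{reject } H_0 \text{ when } \frac{|\bar X - \bar Y|}{\sigma\sqrt{1/n + 1/m}} \ge z_{\alpha/2}.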
Let X1,...,Xn be iid N(μ , σ^2) with μ known and σ^2 unknown. Please carefully re-read...
Let X1,...,Xn be iid N(μ, σ^2) with μ known and σ^2 unknown. Please carefully re-read that sentence, as it is an uncommon setup! (a) Find the Cramer-Rao lower bound for an unbiased estimator of σ^2. (b) The MLE for σ^2 is ∑(xi − μ)^2 / n. Find the variance of this estimator. You may find use of the following property: E[(X − μ)^4] = 3σ^4. (c) Find the efficiency of the MLE. Is it an MVUE? Explain.
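A brief sketch of the standard answers (mine, not the posted solution): with μ known, the Fisher information per observation about σ² is 1/(2σ⁴), and Σ(Xi − μ)²/σ² ∼ χ²_n, so

\text{CRLB} = \frac{2\sigma^4}{n},
\qquad
\operatorname{Var}\!\left( \frac{1}{n}\sum_{i=1}^n (X_i - \mu)^2 \right) = \frac{2\sigma^4}{n},

i.e. the MLE attains the bound (efficiency 1) and, being unbiased when μ is known, is the MVUE.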
Let X1, X2, . . . , Xn be iid N(µ, σ²). Consider the hypotheses...
Let X1, X2, . . . , Xn be iid N(µ, σ²). Consider the hypotheses H0: µ = µ0 and H1: µ ≠ µ0 and the test statistic (X̄ − µ0)/(S/√n). Note that S is used because σ is unknown. a. What is the distribution of the test statistic when H0 is true? b. What is the type I error of an α-level test of this type? Prove it. c. What is...
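For parts a and b, a quick sketch (not the linked solution): under H0,

\frac{\bar X - \mu_0}{S/\sqrt{n}} \sim t_{n-1},
\qquad
P\!\left( |t_{n-1}| \ge t_{\alpha/2,\,n-1} \right) = \alpha,

so rejecting when the absolute value of the statistic exceeds t_{α/2, n−1} gives type I error exactly α.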
Let X1, . . . , Xn ∼ iid N(θ, σ²), with one-sided hypotheses H0...
Let X1, . . . , Xn ∼ iid N(θ, σ²), with one-sided hypotheses H0: θ ≤ θ0 vs H1: θ > θ0. (a) If σ² is known, we can use the UMP size-α test. Find the formula for the P-value of this test.
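A sketch of the standard answer (not the linked page's): the UMP size-α test rejects for large X̄, so the P-value is the upper-tail normal probability at the observed mean,

p = P_{\theta_0}\!\left( \bar X \ge \bar x_{\text{obs}} \right)
  = 1 - \Phi\!\left( \frac{\sqrt{n}\,(\bar x_{\text{obs}} - \theta_0)}{\sigma} \right).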
Let X1, X2, · · · , Xn be iid samples from the density f(x) = θx^(θ−1),...
Let X1, X2, · · · , Xn be iid samples from the density f(x) = θx^(θ−1) for 0 ≤ x ≤ 1, and f(x) = 0 otherwise. Find the maximum likelihood estimate for θ. Explain, using the Strong Law of Large Numbers, why this maximum likelihood estimate is consistent.
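A quick sketch (not the linked solution): the log-likelihood is n log θ + (θ − 1) Σ log xi, so

\hat\theta = \frac{-n}{\sum_{i=1}^n \log X_i},
\qquad
\frac{1}{n}\sum_{i=1}^n \log X_i \xrightarrow{\text{a.s.}} E[\log X_1] = -\frac{1}{\theta}
\;\Rightarrow\; \hat\theta \xrightarrow{\text{a.s.}} \theta,

where the almost-sure convergence is the Strong Law of Large Numbers applied to log X1, ..., log Xn.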
Let X1, . . . , Xn be iid from the distribution with parameter η...
Let X1, . . . , Xn be iid from the distribution with parameter η and probability density function f(x; η) = e^(−(x − η)), x > η, and zero otherwise. 1. Find the MLE of η. 2. Show that X_{1:n} is sufficient and complete for η. 3. Find the UMVUE of η.
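A sketch of the standard answers (mine, under this density, not the posted solution): the likelihood e^{nη − Σxi} 1{η < min_i xi} is increasing in η, so the MLE is the sample minimum; X_{1:n} is complete sufficient, and X_{1:n} − η ∼ Exp(n), so subtracting its mean 1/n gives the UMVUE,

\hat\eta_{\text{MLE}} = X_{1:n} = \min_i X_i,
\qquad
E[X_{1:n}] = \eta + \frac{1}{n}
\;\Rightarrow\;
\text{UMVUE} = X_{1:n} - \frac{1}{n}.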
Let X1,…, Xn be a sample of iid N(0, θ) random variables with Θ = (0, ∞). Determine...
Let X1,…, Xn be a sample of iid N(0, θ) random variables with Θ = (0, ∞). Determine a) the MLE θ̂ of θ. b) E(θ̂). c) the asymptotic variance of the MLE of θ. d) the MLE of SD(Xi) = √θ.
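Reading the garbled parameter as the variance θ (so that SD(Xi) = √θ), a quick sketch of the usual answers (not the posted solution):

\hat\theta = \frac{1}{n}\sum_{i=1}^n X_i^2,
\qquad
E[\hat\theta] = \theta,
\qquad
\operatorname{Var}_{\text{asym}}(\hat\theta) = \frac{1}{n\,I(\theta)} = \frac{2\theta^2}{n},
\qquad
\widehat{\mathrm{SD}} = \sqrt{\hat\theta},

the last equality by invariance of the MLE.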
Consider a random sample (X1, Y1), (X2, Y2), . . . , (Xn, Yn) where Y...
Consider a random sample (X1, Y1), (X2, Y2), . . . , (Xn, Yn) where Y | X = x is modeled by Y = β0 + β1x + ε, ε ∼ N(0, σ²), where β0, β1 and σ² are unknown. Let β̂1 denote the MLE of β1. Derive Var(β̂1).
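A quick sketch (not the linked page's derivation): with the xi treated as fixed, the MLE of the slope is the least-squares slope, a linear combination of the Yi, so

\hat\beta_1 = \frac{\sum_i (x_i - \bar x)(Y_i - \bar Y)}{\sum_i (x_i - \bar x)^2}
            = \sum_i \frac{x_i - \bar x}{\sum_j (x_j - \bar x)^2}\, Y_i,
\qquad
\operatorname{Var}(\hat\beta_1) = \frac{\sigma^2}{\sum_i (x_i - \bar x)^2}.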