Question

In: Statistics and Probability


Let X1, X2, ..., Xn and Y1, Y2, ..., Ym be independent random samples drawn from normal distributions with means μX and μY, respectively, and with the same known variance σ². Use the generalized likelihood ratio criterion to derive a test procedure for choosing between H0: μX = μY and H1: μX ≠ μY.
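Working through the criterion, the generalized likelihood ratio Λ = L(ω̂)/L(Ω̂) for this problem simplifies so that rejecting for small Λ is equivalent to rejecting when |z| = |x̄ − ȳ| / (σ√(1/n + 1/m)) ≥ z_{α/2}, i.e. the two-sample z-test. A minimal Python sketch of the resulting procedure (the function name and the default α = 0.05 are illustrative, not from the question):

```python
import math
from statistics import NormalDist

def two_sample_z_test(x, y, sigma, alpha=0.05):
    """GLRT-derived test of H0: mu_X == mu_Y with known common sigma.

    Rejects H0 when |z| >= z_{alpha/2}, where
    z = (xbar - ybar) / (sigma * sqrt(1/n + 1/m)).
    """
    n, m = len(x), len(y)
    xbar, ybar = sum(x) / n, sum(y) / m
    z = (xbar - ybar) / (sigma * math.sqrt(1.0 / n + 1.0 / m))
    z_crit = NormalDist().inv_cdf(1 - alpha / 2)  # two-sided critical value
    return z, abs(z) >= z_crit
```

Rejecting for small Λ and rejecting for large |z| are the same event because −2 log Λ = z² here, which is also why −2 log Λ has an asymptotic (here, exact) χ²₁ null distribution.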

Solutions

Expert Solution


Related Solutions

Suppose that X1, X2, ..., Xm and Y1, Y2, ..., Yn are independent random samples, with...
Suppose that X1, X2, ..., Xm and Y1, Y2, ..., Yn are independent random samples, with the variables Xi normally distributed with mean μ1 and variance σ1² and the variables Yi normally distributed with mean μ2 and variance σ2². The difference between the sample means, X̄ − Ȳ, is then a linear combination of m + n normally distributed random variables and, by this theorem, is itself normally distributed. (a) Find E(X̄ − Ȳ). (b) Find V(X̄ − Ȳ). (c)...
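For this setup the standard answers are E(X̄ − Ȳ) = μ1 − μ2 and V(X̄ − Ȳ) = σ1²/m + σ2²/n. A quick Monte Carlo spot-check of those two formulas (the parameter values, sample sizes, and function name below are arbitrary illustrations):

```python
import random

def simulate_diff_of_means(mu1, sigma1, m, mu2, sigma2, n, reps=4000, seed=0):
    """Estimate the mean and variance of Xbar - Ybar by simulation.

    Theory predicts mean = mu1 - mu2 and variance = sigma1^2/m + sigma2^2/n.
    """
    rng = random.Random(seed)
    diffs = []
    for _ in range(reps):
        xbar = sum(rng.gauss(mu1, sigma1) for _ in range(m)) / m
        ybar = sum(rng.gauss(mu2, sigma2) for _ in range(n)) / n
        diffs.append(xbar - ybar)
    mean = sum(diffs) / reps
    var = sum((d - mean) ** 2 for d in diffs) / (reps - 1)
    return mean, var
```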
Let X1, X2, ..., Xn be a random sample from the normal...
Let X1, X2, ..., Xn be a random sample from the normal distribution N(µ, 25). To test the hypothesis H0: µ = 40 against H1: µ ≠ 40, let us define the three critical regions: C1 = {x̄ : x̄ ≥ c1}, C2 = {x̄ : x̄ ≤ c2}, and C3 = {x̄ : |x̄ − 40| ≥ c3}. (a) If n = 12, find the values of c1, c2, c3 such that the size of...
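Since the variance is 25, σ = 5, and each cutoff is a normal quantile scaled by the standard error σ/√n: the one-sided regions use z_{1−α} and the two-sided region uses c3 = z_{α/2}·σ/√n. A sketch of the computation (the α value in the test below is only an example, as the preview truncates before stating the intended size):

```python
import math
from statistics import NormalDist

def cutoffs(mu0, sigma, n, alpha):
    """Cutoffs making each of the three critical regions have size alpha."""
    nd = NormalDist()
    se = sigma / math.sqrt(n)
    c1 = mu0 + nd.inv_cdf(1 - alpha) * se   # C1: xbar >= c1
    c2 = mu0 - nd.inv_cdf(1 - alpha) * se   # C2: xbar <= c2
    c3 = nd.inv_cdf(1 - alpha / 2) * se     # C3: |xbar - mu0| >= c3
    return c1, c2, c3
```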
Let X1, X2, ..., Xn be a random sample from a normal...
Let X1, X2, ..., Xn be a random sample from a normal population with mean zero but unknown variance σ². (a) Find a minimum-variance unbiased estimator (MVUE) of σ². Explain why this is a MVUE. (b) Find the distribution and the variance of the MVUE of σ² and prove the consistency of this estimator. (c) Give a formula of a 100(1 − α)% confidence interval for σ² constructed using the...
Consider a random sample (X1, Y1), (X2, Y2), ..., (Xn, Yn) where Y...
Consider a random sample (X1, Y1), (X2, Y2), ..., (Xn, Yn) where Y | X = x is modeled by Y = β0 + β1x + ε, ε ∼ N(0, σ²), where β0, β1, and σ² are unknown. Let β̂1 denote the MLE of β1. Derive V(β̂1).
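For this model the standard result is V(β̂1) = σ²/Σ(xi − x̄)², since β̂1 = Σ(xi − x̄)Yi / Σ(xi − x̄)² is a linear combination of the independent Yi. A small helper that evaluates the formula (the function name is illustrative):

```python
def var_beta1_hat(xs, sigma2):
    """V(beta1_hat) = sigma^2 / sum((x_i - xbar)^2) in simple linear regression."""
    xbar = sum(xs) / len(xs)
    sxx = sum((x - xbar) ** 2 for x in xs)
    return sigma2 / sxx
```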
3. Let X1...Xn be N(μX, σ) and Y1...Yn be iid N(μY, σ) with the two samples X1...Xn and...
3. Let X1...Xn be N(μX, σ) and Y1...Yn be iid N(μY, σ) with the two samples X1...Xn and Y1...Yn independent of each other. Assume that the common population SD σ is known but the two means are not. Consider testing the hypothesis null: μX = μY vs alternative: μX ≠ μY. d. Assume σ = 1 and n = 20. How large must δ be for the size 0.01 test to have power at least 0.99? e. Assume σ = 1 and δ = 0.2. How large must n be for...
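Parts (d) and (e) are power calculations for the two-sided z-test: with equal sample sizes n and true difference δ = μX − μY, the power is Φ(−z_{α/2} + δ/(σ√(2/n))) + Φ(−z_{α/2} − δ/(σ√(2/n))). A sketch of that formula, which can then be inverted numerically for δ or n:

```python
import math
from statistics import NormalDist

def power_two_sample_z(delta, sigma, n, alpha):
    """Power of the two-sided size-alpha two-sample z-test when
    mu_X - mu_Y = delta, with n observations per sample and known sigma."""
    nd = NormalDist()
    z_crit = nd.inv_cdf(1 - alpha / 2)
    shift = delta / (sigma * math.sqrt(2.0 / n))  # mean of Z under the alternative
    return nd.cdf(-z_crit + shift) + nd.cdf(-z_crit - shift)
```

At δ = 0 the formula returns exactly α, a useful sanity check that the test has the stated size.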
3. Let X1...Xn be N(μX, σ) and Y1...Yn be iid N(μY, σ) with the two samples X1...Xn and...
3. Let X1...Xn be N(μX, σ) and Y1...Yn be iid N(μY, σ) with the two samples X1...Xn and Y1...Yn independent of each other. Assume that the common population SD σ is known but the two means are not. Consider testing the hypothesis null: μX = μY vs alternative: μX ≠ μY. a. Find the likelihood ratio test statistic Λ. Specify which MLEs you are using and how you plug them in.
Let X1, X2, ..., Xn be a random sample from a normal population with zero...
Let X1, X2, ..., Xn be a random sample from a normal population with zero mean and unknown variance σ². Find the maximum likelihood estimator of σ².
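Because the mean is known to be zero, maximizing the likelihood gives σ̂² = (1/n)Σxi², with no x̄ correction, since μ is not estimated. In code (the function name is illustrative):

```python
def mle_sigma2_zero_mean(xs):
    """MLE of sigma^2 for N(0, sigma^2): sigma2_hat = (1/n) * sum(x_i^2)."""
    return sum(x * x for x in xs) / len(xs)
```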
1. Let ρ: R² × R² → R be given by ρ((x1, y1), (x2, y2)) = |x1 − x2| + |y1 − y2|. (a) Prove...
1. Let ρ: R² × R² → R be given by ρ((x1, y1), (x2, y2)) = |x1 − x2| + |y1 − y2|. (a) Prove that (R², ρ) is a metric space. (b) In (R², ρ), sketch the open ball with center (0, 0) and radius 1. 2. Let {xn} be a sequence in a metric space (X, ρ). Prove that if xn → a and xn → b for some a, b ∈ X, then a = b. 3. (Optional) Let (C[a,b], ρ) be the metric space discussed in example 10.6 on page 344...
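Part (a) comes down to checking that |x1 − x2| + |y1 − y2| is non-negative, zero only at equal points, symmetric, and satisfies the triangle inequality (each coordinate inherits the triangle inequality of |·| on R). A tiny numeric spot-check of this taxicab metric, which illustrates but does not replace the proof:

```python
def rho(p, q):
    """Taxicab (L1) metric on R^2: |x1 - x2| + |y1 - y2|."""
    return abs(p[0] - q[0]) + abs(p[1] - q[1])
```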
Let X1, X2, X3 be i.i.d. N(0,1) random variables. Suppose Y1 = X1 + X2 + X3, Y2...
Let X1, X2, X3 be i.i.d. N(0,1) random variables. Suppose Y1 = X1 + X2 + X3, Y2 = X1 − X2, Y3 = X1 − X3. Find the joint pdf of Y = (Y1, Y2, Y3)′ using multivariate normal distribution properties.
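Writing Y = AX with X ∼ N(0, I3), the joint distribution is multivariate normal, Y ∼ N(0, AAᵀ), so the joint pdf is determined once AAᵀ is computed. A sketch of that computation (helper name is illustrative):

```python
def mat_mul_transpose(a):
    """Compute A @ A.T for a list-of-lists matrix A. For Y = A X with
    X ~ N(0, I), the covariance matrix of Y is exactly A A^T."""
    return [[sum(ri * rj for ri, rj in zip(row_i, row_j)) for row_j in a]
            for row_i in a]

A = [[1, 1, 1],    # Y1 = X1 + X2 + X3
     [1, -1, 0],   # Y2 = X1 - X2
     [1, 0, -1]]   # Y3 = X1 - X3
```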
2. Let X1, X2, ..., Xn be independent, uniformly distributed random variables on...
2. Let X1, X2, ..., Xn be independent, uniformly distributed random variables on the interval [0, θ]. (a) Find the pdf of X(j), the j-th order statistic. (b) Use the result from (a) to find E(X(j)). (c) Use the result from (b) to find E(X(j) − X(j−1)), the mean difference between two successive order statistics. (d) Suppose that n = 10, and X1, ..., X10 represent the waiting times that the n = 10...
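For Uniform[0, θ] the standard results are E(X(j)) = jθ/(n + 1), so each successive spacing E(X(j) − X(j−1)) equals θ/(n + 1) regardless of j. In code (function names are illustrative):

```python
def expected_order_stat_uniform(j, n, theta):
    """E(X_(j)) = j * theta / (n + 1) for Uniform[0, theta] samples of size n."""
    return j * theta / (n + 1)

def expected_spacing_uniform(n, theta):
    """E(X_(j) - X_(j-1)) = theta / (n + 1), the same for every j."""
    return theta / (n + 1)
```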