Question

In: Statistics and Probability

(a) Suppose X1, ..., Xn1 i.i.d. ∼ N(µ1, σ1²) and Y1, ..., Yn2 i.i.d. ∼ N(µ2, σ2²) are independent normal samples. Suggest an unbiased estimator for µ1 − µ2 and find its standard error. Now suppose n1 = 100, n2 = 200, it is known that σ1² = σ2² = 1, and we calculate X̄ = 5.7, Ȳ = 5.2. Find a 2-standard-error bound on the error of estimation.

(b) Suppose X ∼ Binomial(n1, p1) and Y ∼ Binomial(n2, p2) are independent binomial random variables. Suggest an unbiased estimator for p1 − p2 and find its standard error.

Solutions

Expert Solution

We observe that X1, X2, ..., Xn1 is a random sample of size n1 drawn from a normal population with mean µ1 and variance σ1², and Y1, Y2, ..., Yn2 is an independent random sample of size n2 drawn from a normal population with mean µ2 and variance σ2².

The mean of the first sample is X̄ = (1/n1) Σ Xi and the mean of the second sample is Ȳ = (1/n2) Σ Yi.

Unbiased estimator of µ1 − µ2:

X̄ − Ȳ is an unbiased estimator of µ1 − µ2, since E(X̄ − Ȳ) = E(X̄) − E(Ȳ) = µ1 − µ2.

Standard error of X̄ − Ȳ:

S.E.(X̄ − Ȳ) = √V(X̄ − Ȳ) = √(σ1²/n1 + σ2²/n2)
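As an illustration (not part of the original solution), here is a minimal Python sketch that checks this standard-error formula by simulation. The sample sizes and variances reuse the numbers from part (a); the observed sample means 5.7 and 5.2 are used as stand-in population means, which is an assumption made purely for the demonstration.

```python
# Simulation check of SE(X-bar - Y-bar) = sqrt(sigma1^2/n1 + sigma2^2/n2).
# All parameter values are illustrative stand-ins, not estimates.
import math
import random

random.seed(1)
n1, n2 = 100, 200          # sample sizes from part (a)
mu1, mu2 = 5.7, 5.2        # stand-in population means (assumption for the demo)
sigma1, sigma2 = 1.0, 1.0  # standard deviations, since sigma1^2 = sigma2^2 = 1

reps = 5000
diffs = []
for _ in range(reps):
    x_bar = sum(random.gauss(mu1, sigma1) for _ in range(n1)) / n1
    y_bar = sum(random.gauss(mu2, sigma2) for _ in range(n2)) / n2
    diffs.append(x_bar - y_bar)

mean_diff = sum(diffs) / reps
sd_diff = math.sqrt(sum((d - mean_diff) ** 2 for d in diffs) / (reps - 1))

print(f"mean of X-bar - Y-bar : {mean_diff:.4f}   (mu1 - mu2 = {mu1 - mu2:.4f})")
print(f"empirical SE          : {sd_diff:.5f}")
print(f"theoretical SE        : {math.sqrt(sigma1**2 / n1 + sigma2**2 / n2):.5f}")
```

The empirical standard deviation of the simulated differences should land close to the theoretical value of about 0.122.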

Now suppose n1 = 100, n2 = 200, X̄ = 5.7, Ȳ = 5.2, and σ1² = σ2² = 1.

Standard error of the difference of sample means = √(σ1²/n1 + σ2²/n2) = √(1/100 + 1/200) = √0.015 ≈ 0.12247

The 1-standard-error bound is (X̄ − Ȳ) ± S.E.(X̄ − Ȳ) = 0.5 ± 0.12247, i.e. the interval (0.37753, 0.62247).

The 2-standard-error bound is (X̄ − Ȳ) ± 2 S.E.(X̄ − Ȳ) = 0.5 ± 2 × 0.12247 = 0.5 ± 0.24495, i.e. approximately the interval (0.255, 0.745). The 3-standard-error bound is calculated in the same way.
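The arithmetic above can be reproduced with a short Python snippet; it assumes only the values stated in the question (n1 = 100, n2 = 200, σ1² = σ2² = 1, X̄ = 5.7, Ȳ = 5.2).

```python
# Reproduce the point estimate, standard error, and 2-standard-error bound
# for part (a), using the numbers given in the question.
import math

n1, n2 = 100, 200
var1, var2 = 1.0, 1.0
x_bar, y_bar = 5.7, 5.2

diff = x_bar - y_bar                        # point estimate of mu1 - mu2
se = math.sqrt(var1 / n1 + var2 / n2)       # standard error of X-bar - Y-bar

print(f"estimate        = {diff:.4f}")      # 0.5000
print(f"standard error  = {se:.5f}")        # 0.12247
print(f"2-SE bound      = {2 * se:.5f}")    # 0.24495
print(f"interval        = ({diff - 2 * se:.5f}, {diff + 2 * se:.5f})")
```

Running it prints a 2-standard-error bound of about 0.245 and the interval (0.25505, 0.74495), matching the calculation above.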

(b) If X is a random variable following a Binomial(n1, p1) distribution and Y is an independent random variable following a Binomial(n2, p2) distribution, define p̂1 = X/n1 and p̂2 = Y/n2.

p̂1 − p̂2 is an unbiased estimator of p1 − p2, since E(p̂1 − p̂2) = E(X/n1 − Y/n2) = E(X)/n1 − E(Y)/n2 = n1p1/n1 − n2p2/n2 = p1 − p2.

The standard error of p̂1 − p̂2 is

S.E.(p̂1 − p̂2) = √(V(p̂1) + V(p̂2)) = √(p1q1/n1 + p2q2/n2),

where q1 = 1 − p1 and q2 = 1 − p2, since

V(p̂1) = V(X)/n1² = n1p1q1/n1² = p1q1/n1, and similarly V(p̂2) = V(Y)/n2² = p2q2/n2.
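To make part (b) concrete, here is a small Python simulation that checks both the unbiasedness of p̂1 − p̂2 and the standard-error formula. The question leaves n1, n2, p1, and p2 unspecified, so the values below are made up purely for illustration.

```python
# Monte Carlo check for part (b): X/n1 - Y/n2 is unbiased for p1 - p2, and its
# standard error is sqrt(p1*q1/n1 + p2*q2/n2).
# The parameter values are hypothetical; the question does not specify them.
import math
import random

random.seed(0)
n1, n2 = 100, 200      # hypothetical sample sizes
p1, p2 = 0.60, 0.45    # hypothetical success probabilities

# Theoretical standard error of p1_hat - p2_hat
se = math.sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)

reps = 5000
estimates = []
for _ in range(reps):
    x = sum(random.random() < p1 for _ in range(n1))   # X ~ Binomial(n1, p1)
    y = sum(random.random() < p2 for _ in range(n2))   # Y ~ Binomial(n2, p2)
    estimates.append(x / n1 - y / n2)

mean_est = sum(estimates) / reps
sd_est = math.sqrt(sum((e - mean_est) ** 2 for e in estimates) / (reps - 1))

print(f"true p1 - p2      = {p1 - p2:.4f}")
print(f"mean of estimates = {mean_est:.4f}")   # close to p1 - p2 (unbiasedness)
print(f"theoretical SE    = {se:.4f}")
print(f"empirical SE      = {sd_est:.4f}")
```

The average of the simulated estimates should be close to p1 − p2, and their empirical standard deviation should be close to the theoretical standard error.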

