Question

In: Statistics and Probability

Let $X_1, \dots, X_n$ be i.i.d. random variables with mean 0 and variance $\sigma^2 > 0$. In class we have shown a central limit theorem,
$$\frac{\bar{X}_n}{\sigma/\sqrt{n}} \xrightarrow{d} N(0,1), \quad \text{as } n \to \infty, \qquad (1)$$
with the assumption $E(X_1) = 0$. Using (1), we now prove the theorem for the more general case $E(X_1) = \mu \neq 0$. Now suppose $X_1, \dots, X_n$ are i.i.d. random variables with mean $\mu \neq 0$ and variance $\sigma^2$.

(a) Show that for the dummy random variables $Y_i = X_i - \mu$, $E(Y_i) = 0$ and $\mathrm{Var}(Y_i) = \sigma^2$.

(b) Show that $\dfrac{\bar{Y}_n}{\sigma/\sqrt{n}} = \dfrac{\bar{X}_n - \mu}{\sigma/\sqrt{n}}$.

(c) Based on (a) and (b), argue that the central limit theorem holds for $\mu \neq 0$.

Solutions

Expert Solution
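A sketch of the standard argument, using only linearity of expectation and the zero-mean CLT in (1):

(a) $E(Y_i) = E(X_i - \mu) = E(X_i) - \mu = \mu - \mu = 0$, and $\mathrm{Var}(Y_i) = \mathrm{Var}(X_i - \mu) = \mathrm{Var}(X_i) = \sigma^2$, since subtracting a constant does not change the variance.

(b) $\bar{Y}_n = \frac{1}{n}\sum_{i=1}^n (X_i - \mu) = \bar{X}_n - \mu$, so dividing by $\sigma/\sqrt{n}$ gives
$$\frac{\bar{Y}_n}{\sigma/\sqrt{n}} = \frac{\bar{X}_n - \mu}{\sigma/\sqrt{n}}.$$

(c) By (a), the $Y_i$ are i.i.d. with mean 0 and variance $\sigma^2$, so (1) applies to them: $\bar{Y}_n/(\sigma/\sqrt{n}) \xrightarrow{d} N(0,1)$. By (b), the left side equals $(\bar{X}_n - \mu)/(\sigma/\sqrt{n})$, so
$$\frac{\bar{X}_n - \mu}{\sigma/\sqrt{n}} \xrightarrow{d} N(0,1), \quad \text{as } n \to \infty,$$
which is the central limit theorem for $\mu \neq 0$.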


Related Solutions

R simulation: Let X1, . . . , Xn be i.i.d. random variables from a uniform distribution on [0, 2]. Generate and plot 10 paths of sample means from n = 1 to n = 40 in one figure for each case. Give some comments to empirically check the Law of Large Numbers. (a) When n is large, (X1 + · · · + Xn)/n converges to E[Xi]. (b) When n is large, (X1^2 + · · · + Xn^2)/n converges to...
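A minimal R sketch for part (a), assuming the Uniform[0, 2] setup above; the seed and plot styling are illustrative choices, not part of the question:

```r
# Ten sample-mean paths from Uniform[0, 2], n = 1, ..., 40.
set.seed(1)                      # illustrative seed for reproducibility
n_max   <- 40
n_paths <- 10

# Each column is one path: the running mean of a fresh sample of size n_max.
paths <- sapply(seq_len(n_paths), function(p) {
  x <- runif(n_max, min = 0, max = 2)
  cumsum(x) / seq_len(n_max)
})

# All ten paths in one figure; by the LLN they should settle near E[X_i] = 1.
matplot(paths, type = "l", lty = 1,
        xlab = "n", ylab = "sample mean",
        main = "Sample-mean paths, Uniform[0, 2]")
abline(h = 1, lty = 2)           # LLN limit: E[X_i] = (0 + 2)/2 = 1
```

The same code with x replaced by x^2 in the running mean gives the analogous empirical check for part (b).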
6.42 Let X1,..., Xn be an i.i.d. sequence of Uniform (0,1) random variables. Let M = max(X1,...,Xn). (a) Find the density function of M. (b) Find E[M] and V[M].
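A sketch of the standard computation: by independence, $P(M \le x) = P(X_1 \le x)\cdots P(X_n \le x) = x^n$ for $0 \le x \le 1$, so
$$f_M(x) = n x^{n-1}, \quad 0 \le x \le 1,$$
and therefore
$$E[M] = \int_0^1 x \, n x^{n-1}\,dx = \frac{n}{n+1}, \qquad E[M^2] = \frac{n}{n+2}, \qquad V[M] = \frac{n}{n+2} - \Big(\frac{n}{n+1}\Big)^2 = \frac{n}{(n+1)^2(n+2)}.$$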
Let X1,X2,...,Xn be i.i.d. Gamma random variables with parameters α and λ. The likelihood function is difficult to differentiate because of the gamma function. Rather than finding the maximum likelihood estimators, what are the method of moments estimators of both parameters α and λ?
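A sketch, assuming the shape/rate parameterization with $E[X] = \alpha/\lambda$ and $\mathrm{Var}(X) = \alpha/\lambda^2$: matching the first two moments to their sample versions gives $\alpha/\lambda = \bar{X}$ and $\alpha/\lambda^2 = \hat{\sigma}^2$, where $\hat{\sigma}^2 = \frac{1}{n}\sum_{i=1}^n (X_i - \bar{X})^2$. Dividing the two equations yields
$$\hat{\lambda} = \frac{\bar{X}}{\hat{\sigma}^2}, \qquad \hat{\alpha} = \hat{\lambda}\,\bar{X} = \frac{\bar{X}^2}{\hat{\sigma}^2}.$$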
Let X1, X2, . . . , Xn be iid Poisson random variables with unknown mean µ. 1. Find the maximum likelihood estimator of µ. 2. Determine whether the maximum likelihood estimator is unbiased for µ.
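A sketch: the log-likelihood is $\ell(\mu) = -n\mu + \Big(\sum_{i=1}^n X_i\Big)\log\mu - \sum_{i=1}^n \log(X_i!)$, and solving $\ell'(\mu) = -n + \frac{1}{\mu}\sum_{i=1}^n X_i = 0$ gives
$$\hat{\mu} = \bar{X}.$$
Since $E(\bar{X}) = \mu$, the MLE is unbiased.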
Let X1,…, Xn be a sample of iid N(0, θ) random variables with Θ = (0, ∞). Determine a) the MLE θ̂ of θ. b) E(θ̂). c) the asymptotic variance of the MLE of θ. d) the MLE of SD(Xi) = √θ.
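A sketch, reading θ as the variance of the $N(0, \theta)$ model (consistent with Θ = (0, ∞)): the log-likelihood is $\ell(\theta) = -\frac{n}{2}\log(2\pi\theta) - \frac{1}{2\theta}\sum_{i=1}^n X_i^2$, and setting $\ell'(\theta) = 0$ gives
$$\hat{\theta} = \frac{1}{n}\sum_{i=1}^n X_i^2.$$
Then $E(\hat{\theta}) = E(X_1^2) = \theta$, so the MLE is unbiased; the Fisher information is $I(\theta) = 1/(2\theta^2)$, so the asymptotic variance is $2\theta^2/n$; and by invariance of the MLE, the MLE of $SD(X_i) = \sqrt{\theta}$ is $\sqrt{\hat{\theta}}$.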
Let X1, ..., Xn be i.i.d. random variables with the density function f(x|θ) = e^(θ−x), θ ≤ x. a. Find the method of moments estimate of θ. b. Find the MLE of θ. (Hint: think carefully before taking the derivative; do we have to take a derivative?)
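A sketch: since $X_i - \theta$ is standard exponential, $E(X_i) = \theta + 1$, so the method of moments estimate is $\hat{\theta}_{MOM} = \bar{X} - 1$. For the MLE, the likelihood $L(\theta) = e^{n\theta - \sum X_i}$ on $\{\theta \le \min_i X_i\}$ is increasing in $\theta$, so no derivative is needed: it is maximized at the boundary,
$$\hat{\theta}_{MLE} = \min(X_1, \dots, X_n).$$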
Let X1, . . . , Xn be i.i.d. samples from Uniform(0, θ). Show that for any α ∈ (0, 1), there is a c_{n,α} such that [max(X1,...,Xn), c_{n,α} max(X1,...,Xn)] is a 1 − α confidence interval for θ.
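A sketch: with $M = \max(X_1, \dots, X_n)$, the ratio $M/\theta$ has CDF $t^n$ on $[0,1]$, and $M \le \theta$ always holds. Thus
$$P\big(M \le \theta \le c\,M\big) = P\big(M \ge \theta/c\big) = 1 - (1/c)^n,$$
which equals $1 - \alpha$ when $c_{n,\alpha} = \alpha^{-1/n}$, so $[M, \alpha^{-1/n} M]$ is a $1 - \alpha$ confidence interval for $\theta$.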
2. Let X1, X2, . . . , Xn be independent, uniformly distributed random variables on the interval [0, θ]. (a) Find the pdf of X(j), the jth order statistic. (b) Use the result from (a) to find E(X(j)). (c) Use the result from (b) to find E(X(j) − X(j−1)), the mean difference between two successive order statistics. (d) Suppose that n = 10, and X1, . . . , X10 represent the waiting times that the n = 10...
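A sketch for (a)-(c): the standard order-statistic formula gives
$$f_{X_{(j)}}(x) = \frac{n!}{(j-1)!\,(n-j)!}\,\frac{1}{\theta}\Big(\frac{x}{\theta}\Big)^{j-1}\Big(1-\frac{x}{\theta}\Big)^{n-j}, \quad 0 \le x \le \theta,$$
i.e. $X_{(j)}/\theta \sim \mathrm{Beta}(j, n-j+1)$, so $E(X_{(j)}) = \frac{j\theta}{n+1}$ and the mean spacing is $E(X_{(j)} - X_{(j-1)}) = \frac{\theta}{n+1}$.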
Let X1,...,Xn be independent random variables, and let X = X1 + ... + Xn be their sum. 1. Suppose that each Xi is geometric with respective parameter pi. It is known that the mean of X is equal to μ, where μ > 0. Show that the variance of X is minimized if the pi's are all equal to n/μ. 2. Suppose that each Xi is Bernoulli with respective parameter pi. It is known that the mean of X is equal to μ, where μ >...
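A sketch for part 1, assuming the number-of-trials parameterization $E(X_i) = 1/p_i$, $\mathrm{Var}(X_i) = (1-p_i)/p_i^2$ (which matches the claimed minimizer): the constraint is $\sum_i 1/p_i = \mu$, and by independence
$$\mathrm{Var}(X) = \sum_{i=1}^n \frac{1-p_i}{p_i^2} = \sum_{i=1}^n \frac{1}{p_i^2} - \mu.$$
By the Cauchy-Schwarz inequality, $\sum_i 1/p_i^2 \ge \frac{1}{n}\big(\sum_i 1/p_i\big)^2 = \mu^2/n$, with equality iff all the $1/p_i$ are equal, i.e. $p_i = n/\mu$ for every $i$.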
6. Let X1, . . . , Xn be i.i.d. samples from Uniform(0, θ). (a) Find c_n such that θ̂1 = c_n min(X1, . . . , Xn) is an unbiased estimator of θ. (b) It is easy to show that θ̂2 = 2X̄ is also an unbiased estimator of θ (you do not need to show this). Compare θ̂1 and θ̂2. Which is a better estimator of θ? Specify your criteria.
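A sketch: $\min(X_1, \dots, X_n)/\theta \sim \mathrm{Beta}(1, n)$, so $E[\min] = \theta/(n+1)$ and $c_n = n+1$. Comparing by variance (both estimators are unbiased, so variance equals mean squared error):
$$\mathrm{Var}(\hat{\theta}_1) = (n+1)^2 \cdot \frac{n\,\theta^2}{(n+1)^2(n+2)} = \frac{n\,\theta^2}{n+2}, \qquad \mathrm{Var}(\hat{\theta}_2) = \frac{4}{n}\cdot\frac{\theta^2}{12} = \frac{\theta^2}{3n}.$$
For $n \ge 2$ the second is strictly smaller (it shrinks to 0, while the first approaches $\theta^2$), so under this criterion $\hat{\theta}_2$ is the better estimator.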