Question

In: Advanced Math


R simulation:
Let X1, . . . , Xn be i.i.d. random variables from a uniform distribution on [0, 2]. For each case below, generate and plot 10 paths of the running sample means from n = 1 to n = 40 in one figure, and comment on how the plots empirically support the Law of Large Numbers.

(a) When n is large, (X1 + · · · + Xn)/n converges to E[Xi].
(b) When n is large, (X1^2 + · · · + Xn^2)/n converges to E[Xi^2].

Solutions

Expert Solution

Code 1 (part a):
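A minimal base-R sketch for part (a); the seed, colors, and plot labels are illustrative choices, not part of the original solution. For Uniform[0, 2], the true mean is E[Xi] = 1.

```r
# Part (a): running sample means (X1 + ... + Xn)/n should settle near E[Xi] = 1.
set.seed(1)                # illustrative seed, fixed for reproducibility
n.max   <- 40              # sample sizes n = 1, ..., 40
n.paths <- 10              # number of independent paths

# One column per path: the running means for n = 1, ..., n.max.
paths <- replicate(n.paths, {
  x <- runif(n.max, min = 0, max = 2)
  cumsum(x) / seq_len(n.max)
})

matplot(1:n.max, paths, type = "l", lty = 1, col = 1:n.paths,
        xlab = "n", ylab = "sample mean",
        main = "10 paths of (X1 + ... + Xn)/n, Xi ~ Uniform[0, 2]")
abline(h = 1, lty = 2)     # true mean E[Xi] = 1
```

As n grows, the paths should fluctuate less and cluster around the dashed line at 1, which is exactly what the Law of Large Numbers predicts.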

Plot 1: ten paths of the running sample mean (X1 + · · · + Xn)/n for n = 1 to 40.

Code 2 (part b):
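The same sketch adapted to part (b); again the seed and labels are illustrative. For Uniform[0, 2], the true second moment is E[Xi^2] = ∫₀² x²·(1/2) dx = 4/3.

```r
# Part (b): running means of squares (X1^2 + ... + Xn^2)/n should settle
# near E[Xi^2] = 4/3 for Xi ~ Uniform[0, 2].
set.seed(2)                # illustrative seed
n.max   <- 40
n.paths <- 10

paths2 <- replicate(n.paths, {
  x <- runif(n.max, min = 0, max = 2)
  cumsum(x^2) / seq_len(n.max)
})

matplot(1:n.max, paths2, type = "l", lty = 1, col = 1:n.paths,
        xlab = "n", ylab = "sample mean of squares",
        main = "10 paths of (X1^2 + ... + Xn^2)/n, Xi ~ Uniform[0, 2]")
abline(h = 4/3, lty = 2)   # true second moment E[Xi^2] = 4/3
```

This is the LLN applied to the i.i.d. sequence Xi^2: the paths should converge toward 4/3 as n increases.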

Plot 2: ten paths of the running mean of squares (X1^2 + · · · + Xn^2)/n for n = 1 to 40.



Related Solutions

6.42 Let X1,..., Xn be an i.i.d. sequence of Uniform (0,1) random variables. Let M = max(X1,...,Xn). (a) Find the density function of M. (b) Find E[M] and V[M].
Let X1, . . . , Xn be i.i.d. samples from Uniform(0, θ). Show that for any α ∈ (0, 1), there is a cn,α, such that [max(X1,...,Xn),cn,α max(X1,...,Xn)] is a 1−α confidence interval of θ.
Let X1,...,Xn be i.i.d. random variables with mean 0 and variance σ² > 0. In class we have shown a central limit theorem, X̄n/(σ/√n) → N(0, 1) as n → ∞, (1) under the assumption E(X1) = 0. Using (1), we now prove the theorem for the more general case E(X1) = µ ≠ 0. Now suppose X1,...,Xn are i.i.d. random variables with mean µ ≠ 0 and variance σ². (a) Show that for the auxiliary random variables Yi = Xi − µ, E(Yi) = 0 and V...
Let X1, . . . , Xn be a random sample from a uniform distribution on the interval [a, b] (i) Find the moments estimators of a and b. (ii) Find the MLEs of a and b.
6. Let X1, ..., Xn be i.i.d. samples from Uniform(0, θ). (a) Find cn such that θ̂1 = cn min(X1, ..., Xn) is an unbiased estimator of θ. (b) It is easy to show that θ̂2 = 2X̄ is also an unbiased estimator of θ (you do not need to show this). Compare θ̂1 and θ̂2. Which is a better estimator of θ? Specify your criteria.
Let X1, . . . , Xn be i.i.d. Uniform(θ, θ + 1). Show that θ̂1 = X̄ − 1/2 and θ̂2 = X(n) − n/(n + 1) are both consistent estimators for θ.
Let X1,X2,...,Xn be i.i.d. Gamma random variables with parameters α and λ. The likelihood function is difficult to differentiate because of the gamma function. Rather than finding the maximum likelihood estimators, what are the method of moments estimators of both parameters α and λ?
Problem 1 Let X1, X2, . . . , Xn be independent Uniform(0,1) random variables. ( a) Compute the cdf of Y := min(X1, . . . , Xn). (b) Use (a) to compute the pdf of Y . (c) Find E(Y ).
Let X1,X2, . . . , Xn be a random sample from the uniform distribution with pdf f(x; θ1, θ2) = 1/(2θ2), θ1 − θ2 < x < θ1 + θ2, where −∞ < θ1 < ∞ and θ2 > 0, and the pdf is equal to zero elsewhere. (a) Show that Y1 = min(Xi) and Yn = max(Xi), the joint sufficient statistics for θ1 and θ2, are complete. (b) Find the MVUEs of θ1 and θ2.
Let X1, ..., Xn be i.i.d. random variables with the density function f(x|θ) = e^(θ−x), θ ≤ x. (a) Find the method of moments estimate of θ. (b) Find the MLE of θ. (Hint: Think carefully before taking the derivative; do we have to take the derivative?)