Question

In: Statistics and Probability


1. Let X1, . . . , Xn, Y1, . . . , Yn be mutually independent random variables, and Z = (1/n) Σ_{i=1}^{n} Xi Yi. Suppose for each i ∈ {1, . . . , n}, Xi ∼ Bernoulli(p), Yi ∼ Binomial(n, p). What is Var[Z]?
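A closed form follows from computing Var[Xi Yi] out of the moments of the independent Xi and Yi. As a numerical sanity check against whatever formula you derive, here is a minimal Monte Carlo sketch (stdlib Python only; the function name, trial count, and seed are illustrative choices, not part of the problem):

```python
import random

def estimate_var_z(n, p, trials=20000, seed=0):
    """Monte Carlo estimate of Var[Z] for Z = (1/n) * sum_i Xi*Yi,
    with Xi ~ Bernoulli(p) and Yi ~ Binomial(n, p), all mutually
    independent. A sanity check, not a derivation."""
    rng = random.Random(seed)

    def bern():
        return 1 if rng.random() < p else 0

    def binom():
        # Binomial(n, p) sampled as n independent Bernoulli(p) trials
        return sum(bern() for _ in range(n))

    zs = [sum(bern() * binom() for _ in range(n)) / n for _ in range(trials)]
    mean = sum(zs) / trials
    return sum((z - mean) ** 2 for z in zs) / (trials - 1)
```

For instance, with n = 10 and p = 0.3 the estimate should land close to the value your closed form predicts.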

2. There is a fair coin and a biased coin that flips heads with probability 1/4. You randomly pick one of the coins and flip it until you get a heads. Let X be the number of flips you need. Compute E[X] and Var[X].
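Because the coin is picked at random, X is a mixture of two geometric distributions, so conditioning on the coin (total expectation / total variance) is the natural route. A small simulation to check the numbers you obtain (the setup below is an assumed, illustrative one; stdlib only):

```python
import random

def simulate_flips(trials=100000, seed=1):
    """Monte Carlo estimates of E[X] and Var[X], where X counts flips
    until the first head after choosing the fair coin or the p = 1/4
    coin uniformly at random. A numerical check, not a derivation."""
    rng = random.Random(seed)
    xs = []
    for _ in range(trials):
        p = 0.5 if rng.random() < 0.5 else 0.25  # pick a coin at random
        flips = 1
        while rng.random() >= p:                 # flip until heads
            flips += 1
        xs.append(flips)
    mean = sum(xs) / trials
    var = sum((x - mean) ** 2 for x in xs) / (trials - 1)
    return mean, var
```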

3. A man has a set of n keys, one of which fits the door to his apartment. He tries the keys randomly, throwing away each ill-fitting key that he tries until he finds the key that fits. That is, he chooses keys randomly from among those he has not yet tried. This way he is sure to find the right key within n tries. Let T be the number of times he tries keys until he finds the right key. Write closed-form expressions for E[T] and Var[T]. Hint: Write T as a linear combination of indicator variables.
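Following the hint, T is determined by where the fitting key lands in a uniformly random ordering of the n keys, so shuffling reproduces the process exactly. A quick check of whatever closed forms you derive (illustrative helper name and parameters; stdlib only):

```python
import random

def key_trials(n, trials=50000, seed=2):
    """Estimate E[T] and Var[T] for the key-trying process by shuffling
    the n keys and reading off the 1-based position of the fitting key,
    which equals the number of tries T."""
    rng = random.Random(seed)
    ts = []
    for _ in range(trials):
        keys = list(range(n))
        rng.shuffle(keys)
        ts.append(keys.index(0) + 1)  # key 0 plays the fitting key
    mean = sum(ts) / trials
    var = sum((t - mean) ** 2 for t in ts) / (trials - 1)
    return mean, var
```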

4. (a) What is the probability that a 5-card poker hand has at least three spades? (b) What upper bound does Markov’s Theorem give for this probability? (c) What upper bound does Chebyshev’s Theorem give for this probability?
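Part (a) is a hypergeometric count: choose k of the 13 spades and 5 − k of the other 39 cards, for k = 3, 4, 5. The exact probability can be computed directly with `math.comb` (the function name below is illustrative):

```python
from math import comb

def prob_at_least_three_spades():
    """Exact P(at least 3 spades in a 5-card hand): hands with exactly
    k spades out of 13 and 5-k non-spades out of 39, over all hands."""
    total = comb(52, 5)
    favorable = sum(comb(13, k) * comb(39, 5 - k) for k in range(3, 6))
    return favorable / total
```

Comparing this exact value with the bounds from parts (b) and (c) shows how loose Markov and Chebyshev can be.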

5. A random variable X is always strictly larger than −100. You know that E[X] = −60. Give the best upper bound you can on P(X ≥ −20).
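Since X − (−100) is a positive random variable with known mean, one standard route (assuming Markov's inequality is the intended tool) is to apply it to the shifted variable:

```latex
Y = X + 100 > 0,\qquad \mathbb{E}[Y] = 40,\qquad
\mathbb{P}(X \ge -20) = \mathbb{P}(Y \ge 80) \le \frac{\mathbb{E}[Y]}{80} = \frac{1}{2}.
```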

6. Suppose that we roll a standard fair die 100 times. Let X be the sum of the numbers that appear over the 100 rolls. Use Chebyshev’s inequality to bound P[|X − 350| ≥ 50].
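Chebyshev needs Var[X] = 100 · Var[one roll], with E[X] = 350. The sketch below computes the resulting bound and compares it with an empirical tail frequency (the trial count and seed are arbitrary illustrative choices):

```python
import random

def chebyshev_die_bound(trials=50000, seed=3):
    """Compute the Chebyshev bound on P(|X - 350| >= 50), where X is
    the sum of 100 fair die rolls, and compare it with a Monte Carlo
    estimate of the same tail probability."""
    var_one = sum((k - 3.5) ** 2 for k in range(1, 7)) / 6  # = 35/12
    bound = 100 * var_one / 50 ** 2                          # Var[X] / 50^2
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        x = sum(rng.randint(1, 6) for _ in range(100))
        if abs(x - 350) >= 50:
            hits += 1
    return bound, hits / trials
```

The empirical frequency is far below the bound, which is typical: Chebyshev is distribution-free and therefore conservative.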

7. Given any two random variables X and Y, by the linearity of expectation we have E[X − Y] = E[X] − E[Y]. Prove that, when X and Y are independent, Var[X − Y] = Var[X] + Var[Y].
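The proof expands Var[X − Y] = E[(X − Y)²] − (E[X − Y])² and uses E[XY] = E[X]·E[Y], which holds under independence. As a numerical illustration for one assumed pair of independent distributions (a die roll and a Bernoulli(0.3) variable — a check of the identity, not a proof):

```python
import random

def variance_of_difference(trials=200000, seed=4):
    """Compare a sample estimate of Var[X - Y] with Var[X] + Var[Y]
    for independent X ~ Uniform{1..6} and Y ~ Bernoulli(0.3)."""
    rng = random.Random(seed)
    xs = [rng.randint(1, 6) for _ in range(trials)]
    ys = [1 if rng.random() < 0.3 else 0 for _ in range(trials)]

    def var(vs):
        m = sum(vs) / len(vs)
        return sum((v - m) ** 2 for v in vs) / (len(vs) - 1)

    lhs = var([x - y for x, y in zip(xs, ys)])  # Var[X - Y], estimated
    rhs = var(xs) + var(ys)                     # Var[X] + Var[Y]
    return lhs, rhs
```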
