Statistics and Probability
1. Let $X_1, \ldots, X_n, Y_1, \ldots, Y_n$ be mutually independent random variables, and let $Z = \frac{1}{n}\sum_{i=1}^{n} X_i Y_i$. Suppose for each $i \in \{1, \ldots, n\}$, $X_i \sim \mathrm{Bernoulli}(p)$ and $Y_i \sim \mathrm{Binomial}(n, p)$. What is Var[Z]?
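A quick Monte Carlo sketch can be used to sanity-check a closed-form answer for Var[Z]; the concrete values n = 10 and p = 0.3 below are illustrative choices, not part of the problem.

```python
import numpy as np

# Illustrative parameters only; the problem leaves n and p symbolic.
n, p, trials = 10, 0.3, 200_000
rng = np.random.default_rng(0)

X = rng.binomial(1, p, size=(trials, n))   # X_i ~ Bernoulli(p)
Y = rng.binomial(n, p, size=(trials, n))   # Y_i ~ Binomial(n, p), independent of the X_i
Z = (X * Y).mean(axis=1)                   # Z = (1/n) * sum_i X_i * Y_i

print("empirical Var[Z]:", Z.var())
```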
2. There is a fair coin and a biased coin that comes up heads with probability 1/4. You randomly pick one of the coins and flip it until you get heads. Let X be the number of flips you need. Compute E[X] and Var[X].
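A simulation sketch for checking E[X] and Var[X], assuming "randomly pick" means the coin is chosen uniformly at random (the standard reading):

```python
import random

random.seed(0)
trials = 200_000
samples = []
for _ in range(trials):
    q = random.choice([0.5, 0.25])     # fair coin or biased coin, picked uniformly at random
    flips = 1
    while random.random() >= q:        # each flip comes up heads with probability q
        flips += 1
    samples.append(flips)

mean = sum(samples) / trials
var = sum((x - mean) ** 2 for x in samples) / trials
print("empirical E[X]:", mean, "empirical Var[X]:", var)
```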
3. A man has a set of n keys, one of which fits the door to his apartment. He tries the keys in random order, discarding each key that does not fit, until he finds the one that does. That is, each try he chooses a key uniformly at random from among those he has not yet tried. This way he is sure to find the right key within n tries. Let T be the number of keys he tries until he finds the right one. Write closed-form expressions for E[T] and Var[T]. Hint: Write T as a linear combination of indicator variables.
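A short simulation of the key-trying process; the choice n = 8 is purely illustrative:

```python
import random

random.seed(0)

def keys_tried(n):
    """Try the n keys in a uniformly random order; return the position of the fitting key."""
    order = list(range(n))        # key 0 is the one that fits
    random.shuffle(order)
    return order.index(0) + 1     # T counts every key tried, including the successful one

n, runs = 8, 200_000
samples = [keys_tried(n) for _ in range(runs)]
mean = sum(samples) / runs
var = sum((t - mean) ** 2 for t in samples) / runs
print("empirical E[T]:", mean, "empirical Var[T]:", var)
```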
4. (a) What is the probability that a 5-card poker hand has at least three spades? (b) What upper bound does Markov’s Theorem give for this probability? (c) What upper bound does Chebyshev’s Theorem give for this probability?
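One way to check part (a) numerically is by direct counting over 5-card hands; a minimal sketch:

```python
from math import comb

total_hands = comb(52, 5)
# Count hands with exactly k spades (13 spades, 39 non-spades), for k = 3, 4, 5.
at_least_three = sum(comb(13, k) * comb(39, 5 - k) for k in range(3, 6))
print("P(at least three spades) =", at_least_three / total_hands)
```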
5. A random variable X is always strictly larger than −100. You know that E[X] = −60. Give the best upper bound you can on P(X ≥ −20).
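Any candidate upper bound on P(X ≥ −20) can be tested against a concrete distribution that meets the constraints; the two-point distribution below is purely illustrative:

```python
import random

random.seed(0)
# Illustrative distribution only: X takes the value -20 or -99,
# with P(X = -20) = 39/79 so that E[X] = -60 and X > -100 always.
q = 39 / 79
trials = 200_000
xs = [-20 if random.random() < q else -99 for _ in range(trials)]

print("empirical E[X]:", sum(xs) / trials)
print("empirical P(X >= -20):", sum(x >= -20 for x in xs) / trials)
```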
6. Suppose that we roll a standard fair die 100 times. Let X be the sum of the numbers that appear over the 100 rolls. Use Chebyshev’s inequality to bound P[|X − 350| ≥ 50].
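For comparison with whatever bound Chebyshev's inequality gives, the tail probability can also be estimated directly; a minimal sketch:

```python
import random

random.seed(0)

def roll_sum():
    """Sum of 100 rolls of a fair six-sided die."""
    return sum(random.randint(1, 6) for _ in range(100))

trials = 100_000
tail = sum(abs(roll_sum() - 350) >= 50 for _ in range(trials)) / trials
print("empirical P(|X - 350| >= 50):", tail)
```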
7. Given any two random variables X and Y, by the linearity of expectation we have E[X − Y] = E[X] − E[Y]. Prove that, when X and Y are independent, Var[X − Y] = Var[X] + Var[Y].
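The identity can also be checked numerically for any pair of independent distributions; the choices below (exponential and binomial) are arbitrary illustrations:

```python
import numpy as np

rng = np.random.default_rng(0)
trials = 500_000
X = rng.exponential(2.0, trials)       # Var[X] = 4
Y = rng.binomial(10, 0.3, trials)      # Var[Y] = 10 * 0.3 * 0.7 = 2.1, independent of X

print("Var[X - Y] ~", (X - Y).var())
print("Var[X] + Var[Y] ~", X.var() + Y.var())
```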