Let X ∼ Pois(µ).
(a) Find an unbiased estimator of µ. Hint: recall that E(X) = µ.
(b) What is the standard deviation of your estimator? Also recall that σ²_X = µ for Poisson r.v.'s.
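For part (a), since E(X) = µ, the observation X itself is an unbiased estimator of µ; for part (b), since σ²_X = µ, its standard deviation is √µ. Below is a minimal NumPy sketch that checks this numerically; the value µ = 4 and the number of draws are illustrative assumptions, not part of the problem.

```python
import numpy as np

rng = np.random.default_rng(0)
mu = 4.0                      # hypothetical true Poisson mean (assumption)
draws = rng.poisson(mu, size=1_000_000)

# Each draw X is an unbiased estimate of mu: the average of many independent
# draws should be close to mu, and their standard deviation close to sqrt(mu).
print(draws.mean())        # ~ 4.0, estimating E(X) = mu
print(draws.std(ddof=1))   # ~ 2.0, estimating sd(X) = sqrt(mu)
```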
Recall that if X_i is a normally distributed random variable with mean μ and variance σ², then E(X_i) = μ and Var(X_i) = σ². Therefore:
\[
E(\bar{X}) = E\left(\frac{1}{n}\sum_{i=1}^{n} X_i\right)
= \frac{1}{n}\sum_{i=1}^{n} E(X_i)
= \frac{1}{n}\sum_{i=1}^{n}\mu
= \frac{1}{n}(n\mu)
= \mu
\]
The first equality holds because we've merely replaced X̄ with its definition. The second equality holds by the rules of expectation for a linear combination. The third equality holds because E(X_i) = μ. The fourth equality holds because adding the value μ up n times gives nμ. And, of course, the last equality is simple algebra.
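As a quick numerical check of this result, here is a minimal simulation sketch; the parameters μ = 5, σ = 2, the sample size n = 10, and the number of replications are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
mu, sigma, n, reps = 5.0, 2.0, 10, 200_000   # all values are assumptions

# Draw many samples of size n and compute the sample mean of each.
samples = rng.normal(mu, sigma, size=(reps, n))
xbar = samples.mean(axis=1)

# The average of the sample means should be close to mu,
# illustrating E(X-bar) = mu.
print(xbar.mean())   # ~ 5.0
```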
In summary, we have shown that:
\[
E(\bar{X}) = \mu
\]
Therefore, the maximum likelihood estimator of μ is unbiased. Now, let's check the maximum likelihood estimator of σ². First, note that we can rewrite the formula for the MLE as:
\[
\hat{\sigma}^2 = \left(\frac{1}{n}\sum_{i=1}^{n} X_i^2\right) - \bar{X}^2
\]
because:
\[
\hat{\sigma}^2
= \frac{1}{n}\sum_{i=1}^{n}\left(X_i - \bar{X}\right)^2
= \frac{1}{n}\sum_{i=1}^{n}\left(X_i^2 - 2X_i\bar{X} + \bar{X}^2\right)
= \frac{1}{n}\sum_{i=1}^{n}X_i^2 - 2\bar{X}\cdot\frac{1}{n}\sum_{i=1}^{n}X_i + \bar{X}^2
= \frac{1}{n}\sum_{i=1}^{n}X_i^2 - \bar{X}^2
\]
Then, taking the expectation of the MLE, we get:
\[
E(\hat{\sigma}^2) = \frac{(n-1)\sigma^2}{n}
\]
as illustrated here:
\[
\begin{aligned}
E(\hat{\sigma}^2)
&= E\left[\frac{1}{n}\sum_{i=1}^{n} X_i^2 - \bar{X}^2\right]
= \left[\frac{1}{n}\sum_{i=1}^{n} E(X_i^2)\right] - E(\bar{X}^2) \\
&= \frac{1}{n}\sum_{i=1}^{n}\left(\sigma^2 + \mu^2\right) - \left(\frac{\sigma^2}{n} + \mu^2\right)
= \frac{1}{n}\left(n\sigma^2 + n\mu^2\right) - \frac{\sigma^2}{n} - \mu^2 \\
&= \sigma^2 - \frac{\sigma^2}{n}
= \frac{n\sigma^2 - \sigma^2}{n}
= \frac{(n-1)\sigma^2}{n}
\end{aligned}
\]
The first equality holds from the rewritten form of the MLE. The second equality holds from the properties of expectation. The third equality holds from manipulating the alternative formulas for the variance, namely:
\[
\mathrm{Var}(X) = \sigma^2 = E(X^2) - \mu^2
\qquad\text{and}\qquad
\mathrm{Var}(\bar{X}) = \frac{\sigma^2}{n} = E(\bar{X}^2) - \mu^2
\]
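Rearranged, these identities give the two substitutions used in the third equality:
\[
E(X^2) = \sigma^2 + \mu^2
\qquad\text{and}\qquad
E(\bar{X}^2) = \frac{\sigma^2}{n} + \mu^2
\]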
The remaining equalities hold from simple algebraic manipulation. Now, because we have shown:
\[
E(\hat{\sigma}^2) \neq \sigma^2
\]
the maximum likelihood estimator of σ² is a biased estimator.
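This bias can also be seen numerically: averaging the MLE over many simulated samples gives roughly (n − 1)σ²/n rather than σ². Below is a minimal sketch; the values μ = 5, σ = 2, n = 10, and the number of replications are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)
mu, sigma, n, reps = 5.0, 2.0, 10, 200_000   # all values are assumptions

samples = rng.normal(mu, sigma, size=(reps, n))
xbar = samples.mean(axis=1, keepdims=True)

# MLE of sigma^2: divide the sum of squared deviations by n (not n - 1).
sigma2_mle = ((samples - xbar) ** 2).mean(axis=1)

print(sigma2_mle.mean())        # ~ 3.6, close to (n - 1) * sigma^2 / n
print((n - 1) * sigma**2 / n)   # 3.6, the value derived above
print(sigma**2)                 # 4.0, the quantity being estimated
```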