Question

In: Statistics and Probability


Let X ∼Pois(µ).

(a) Find an unbiased estimator of µ. Hint: recall that E(X) = µ.

(b) What is the standard deviation of your estimator? Also recall that σ²_X = µ for Poisson r.v.'s.

Solutions

Expert Solution

Recall that if X_i is a normally distributed random variable with mean μ and variance σ², then E(X_i) = μ and Var(X_i) = σ². (The unbiasedness argument below uses only the fact that E(X_i) = μ, so it applies just as well to an i.i.d. Poisson sample, where μ = σ²_X.) Therefore, for the sample mean X̄ = (1/n)∑_{i=1}^n X_i of an i.i.d. sample:

E(\bar{X}) = E\left(\frac{1}{n}\sum_{i=1}^{n} X_i\right) = \frac{1}{n}\sum_{i=1}^{n} E(X_i) = \frac{1}{n}\sum_{i=1}^{n} \mu = \frac{1}{n}(n\mu) = \mu

The first equality holds because we've merely replaced X-bar with its definition. The second equality holds by linearity of expectation, i.e., the rules of expectation for a linear combination. The third equality holds because E(X_i) = μ. The fourth equality holds because adding the value μ to itself n times gives nμ. And, of course, the last equality is simple algebra.

In summary, we have shown that:

E(\bar{X}) = \mu
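Since the original question is about a Poisson random variable, a quick simulation can illustrate this result in that setting. This is a minimal sketch: the values of µ, n, and the number of replications below are arbitrary choices for illustration, not part of the problem.

```python
import numpy as np

rng = np.random.default_rng(0)

mu = 3.0        # hypothetical true Poisson mean (not given in the problem)
n = 50          # hypothetical sample size
reps = 100_000  # number of simulated samples

# Draw `reps` samples of size n from Pois(mu) and compute the sample mean of each.
samples = rng.poisson(mu, size=(reps, n))
xbars = samples.mean(axis=1)

# The average of the sample means should be close to mu, illustrating E(X-bar) = mu.
print(xbars.mean())
# Its spread should be close to sqrt(mu / n), since Var(X) = mu for Poisson r.v.'s.
print(xbars.std(ddof=0), np.sqrt(mu / n))
```

Combining the hint σ²_X = µ with Var(X̄) = σ²/n (used later in this solution), the standard deviation of the sample-mean estimator is √(µ/n), which the simulated spread above should match.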

Therefore, the maximum likelihood estimator of μ, which is just the sample mean X-bar, is unbiased. Now, let's check the maximum likelihood estimator of σ², namely σ̂² = (1/n)∑_{i=1}^n (X_i − X̄)². First, note that we can rewrite this formula as:

\hat{\sigma}^2 = \left(\frac{1}{n}\sum_{i=1}^{n} X_i^2\right) - \bar{X}^2

because:

\frac{1}{n}\sum_{i=1}^{n} (X_i - \bar{X})^2 = \frac{1}{n}\sum_{i=1}^{n} \left(X_i^2 - 2X_i\bar{X} + \bar{X}^2\right) = \frac{1}{n}\sum_{i=1}^{n} X_i^2 - 2\bar{X}\cdot\frac{1}{n}\sum_{i=1}^{n} X_i + \bar{X}^2 = \frac{1}{n}\sum_{i=1}^{n} X_i^2 - \bar{X}^2
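As a quick numerical sanity check of this identity (a minimal sketch; the data values below are made up for illustration):

```python
import numpy as np

# Arbitrary illustrative data (not from the problem).
x = np.array([2.0, 5.0, 1.0, 4.0, 3.0])
n = len(x)
xbar = x.mean()

lhs = np.sum((x - xbar) ** 2) / n          # (1/n) * sum of (X_i - X-bar)^2
rhs = np.sum(x ** 2) / n - xbar ** 2       # (1/n) * sum of X_i^2, minus X-bar^2

print(lhs, rhs)  # the two forms agree (up to floating-point rounding)
```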

Then, taking the expectation of the MLE, we get:

E(\hat{\sigma}^2) = \frac{(n-1)\sigma^2}{n}

as illustrated here:

E(\hat{\sigma}^2) = E\left[\frac{1}{n}\sum_{i=1}^{n} X_i^2 - \bar{X}^2\right] = \left[\frac{1}{n}\sum_{i=1}^{n} E(X_i^2)\right] - E(\bar{X}^2) = \frac{1}{n}\sum_{i=1}^{n}\left(\sigma^2 + \mu^2\right) - \left(\frac{\sigma^2}{n} + \mu^2\right) = \frac{1}{n}\left(n\sigma^2 + n\mu^2\right) - \frac{\sigma^2}{n} - \mu^2 = \sigma^2 - \frac{\sigma^2}{n} = \frac{n\sigma^2 - \sigma^2}{n} = \frac{(n-1)\sigma^2}{n}

The first equality holds from the rewritten form of the MLE. The second equality holds from the properties of expectation. The third equality holds from manipulating the alternative formulas for the variance, namely:

\mathrm{Var}(X) = \sigma^2 = E(X^2) - \mu^2 \quad\text{and}\quad \mathrm{Var}(\bar{X}) = \frac{\sigma^2}{n} = E(\bar{X}^2) - \mu^2

The remaining equalities hold from simple algebraic manipulation. Now, because we have shown:

E(\hat{\sigma}^2) \neq \sigma^2

the maximum likelihood estimator of σ² is a biased estimator.
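A small simulation makes the bias visible. This is a minimal sketch under assumed values of µ, σ², and n (none of which are given in the text); the average of the MLE over many samples should land near (n − 1)σ²/n rather than σ².

```python
import numpy as np

rng = np.random.default_rng(1)

mu, sigma2 = 0.0, 4.0   # hypothetical true parameters (not given in the text)
n = 10                  # a small n makes the bias easy to see
reps = 200_000          # number of simulated samples

# Draw many samples and compute the MLE of sigma^2 (divide by n, not n - 1) for each.
samples = rng.normal(mu, np.sqrt(sigma2), size=(reps, n))
sigma2_mle = samples.var(axis=1, ddof=0)   # (1/n) * sum of (X_i - X-bar)^2

print(sigma2_mle.mean())      # close to (n - 1) * sigma2 / n = 3.6, not 4.0
print((n - 1) * sigma2 / n)   # the theoretical expectation derived above
```

Switching `ddof=0` to `ddof=1` (dividing by n − 1 instead of n) removes the bias, which is exactly the usual correction applied to the sample variance.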

