1. Consider the linear regression model for a random sample of size n: yi = β0 + vi ; i = 1, . . . , n, where v is a random error term. Notice that this model is equivalent to the one seen in class, but without the slope β1.
(a) State the minimization problem that leads to the estimation of β0.
(b) Construct the first-order condition to compute a minimum from the above objective function and use it to obtain the expression of the ordinary least squares estimator of
β0. Call this estimator β˜0.
(c) Suppose now that the true population model is given by: Y = β0 + β1X + u, where u is an error term which satisfies E(u) = 0, E(u|X) = 0 and Var(u|X) = σ^2
(homoskedasticity). This implies that, in our sample, yi = β0 + β1xi + ui ;i = 1, . . . , n. Show that, in general, β˜0 is a biased estimator of β0. When is the bias equal to 0?
(a) The minimization problem is the minimization of the residual sum of squares (RSS). The residual for observation i is yi − β0, so RSS(β0) = Σ_{i=1}^{n} (yi − β0)^2. Hence, the minimization problem is

min_{β0} Σ_{i=1}^{n} (yi − β0)^2,

i.e. the minimization of the RSS with respect to the parameter β0.
(b) The FOC is obtained by setting the derivative of the RSS with respect to β0 equal to zero:

dRSS/dβ0 = −2 Σ_{i=1}^{n} (yi − β˜0) = 0,

which gives Σ_{i=1}^{n} yi − n β˜0 = 0, and therefore

β˜0 = (1/n) Σ_{i=1}^{n} yi = ȳ,

i.e. the OLS estimator of β0 in this model is the sample mean of y.
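As a quick numerical sanity check of the FOC result, the sketch below (using NumPy, with a made-up sample of y values) compares the closed-form estimator β˜0 = ȳ against a brute-force grid search over candidate intercepts; the two should agree up to the grid resolution.

```python
import numpy as np

# Hypothetical sample of y values; any numbers illustrate the point.
y = np.array([2.0, 3.5, 1.0, 4.0, 2.5])

# The OLS estimator from the FOC: the sample mean of y.
beta0_tilde = y.mean()

# RSS as a function of a candidate intercept b.
def rss(b):
    return np.sum((y - b) ** 2)

# Brute-force grid search over candidate intercepts:
# the minimizer should match y-bar.
grid = np.linspace(y.min(), y.max(), 100001)
b_star = grid[np.argmin([rss(b) for b in grid])]

print(beta0_tilde)  # 2.6
print(b_star)       # ~2.6, up to grid resolution
```

The grid search is only an illustration; the FOC gives the exact minimizer in closed form.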
(c) If the true regression model is yi = β0 + β1xi + ui, the estimator from part (b) becomes

β˜0 = ȳ = (1/n) Σ_{i=1}^{n} (β0 + β1xi + ui) = β0 + β1x̄ + ū.

Taking expectations conditional on the xi and using E(ui|X) = 0, so that E(ū|X) = 0, we have

E(β˜0|X) = β0 + β1x̄.

Since, in general, E(β˜0|X) ≠ β0, β˜0 is a biased estimator of the true β0. The bias is

Bias(β˜0) = E(β˜0|X) − β0 = β1x̄.

The bias is zero when β1x̄ = 0, i.e. when β1 = 0 and/or x̄ = 0. Hence, if the sample mean of the independent variable is zero and/or the true slope coefficient is zero, the bias is zero.
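The bias formula Bias(β˜0) = β1x̄ can be checked by Monte Carlo simulation. The sketch below (a NumPy illustration with arbitrarily chosen parameter values) draws many samples from the true model y = β0 + β1x + u, computes β˜0 = ȳ in each, and compares the average estimate with β0; the simulated bias should be close to β1 · E(X), and should vanish when E(X) = 0 or β1 = 0.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulated_bias(beta0, beta1, x_mean, n=100, reps=5000):
    """Monte Carlo estimate of E(beta0_tilde) - beta0 when beta0_tilde = y-bar
    is used but the true model is y = beta0 + beta1*x + u."""
    x = rng.normal(loc=x_mean, scale=1.0, size=(reps, n))
    u = rng.normal(loc=0.0, scale=1.0, size=(reps, n))  # E(u|X) = 0, homoskedastic
    y = beta0 + beta1 * x + u
    beta0_tilde = y.mean(axis=1)        # the estimator from part (b): y-bar
    return beta0_tilde.mean() - beta0   # average bias across replications

# Bias is approximately beta1 * E(X): here 0.5 * 2 = 1.
bias_general = simulated_bias(beta0=1.0, beta1=0.5, x_mean=2.0)

# The bias vanishes when E(X) = 0 or when beta1 = 0.
bias_zero_mean = simulated_bias(beta0=1.0, beta1=0.5, x_mean=0.0)
bias_zero_slope = simulated_bias(beta0=1.0, beta1=0.0, x_mean=2.0)

print(bias_general, bias_zero_mean, bias_zero_slope)
```

The three printed values illustrate the two cases in which the bias is zero; the first is close to 1, the other two are close to 0 (up to simulation noise).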