Question

In: Statistics and Probability

This exercise will investigate some inferential questions about the intercept β0 from a simple linear model.

a. Show that E(Ȳ) = β0 + β1x̄.

b. Show that β̂0 is an unbiased estimator for β0.

c. Calculate the variance of β̂0. Hint: Recall that Ȳ and β̂1 are independent.

Solutions

Expert Solution

Simple Linear Regression

Model: Y = β0 + β1x + ε
• ε is the random error, so Y is a random variable too.

Sample: (x1, Y1), (x2, Y2), …, (xn, Yn)
Each (xi, Yi) satisfies Yi = β0 + β1xi + εi.

Least Squares Estimators (all sums run over i = 1, …, n):

β̂1 = Σᵢ(xi − x̄)(Yi − Ȳ) / Σᵢ(xi − x̄)²,   β̂0 = Ȳ − β̂1x̄
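As a quick sanity check on these formulas, here is a minimal pure-Python sketch that computes β̂0 and β̂1 directly from the closed-form expressions above. The data values are illustrative assumptions, not part of the exercise; they are generated exactly from y = 2 + 3x, so the fit should recover the intercept and slope exactly.

```python
# Compute the least-squares estimates from the closed-form formulas above.
def least_squares(x, y):
    n = len(x)
    xbar = sum(x) / n
    ybar = sum(y) / n
    sxx = sum((xi - xbar) ** 2 for xi in x)                    # Sxx = sum (xi - xbar)^2
    sxy = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))
    b1 = sxy / sxx                                             # beta1-hat
    b0 = ybar - b1 * xbar                                      # beta0-hat = Ybar - beta1-hat * xbar
    return b0, b1

# Illustrative data generated exactly from y = 2 + 3x (no noise), so the fit is exact.
x = [1.0, 2.0, 3.0, 4.0]
y = [2.0 + 3.0 * xi for xi in x]
b0, b1 = least_squares(x, y)
print(b0, b1)  # 2.0 3.0
```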

Proposition: The estimators β̂0 and β̂1 are unbiased; that is, E[β̂0] = β0 and E[β̂1] = β1.
Proof:

β̂1 = Σᵢ(xi − x̄)(Yi − Ȳ) / Σᵢ(xi − x̄)²
   = [Σᵢ(xi − x̄)Yi − Ȳ·Σᵢ(xi − x̄)] / Σᵢ(xi − x̄)²
   = Σᵢ(xi − x̄)Yi / Σᵢ(xi − x̄)²,

since Σᵢ(xi − x̄) = 0. Thus

E[β̂1] = Σᵢ [(xi − x̄) / Σⱼ(xj − x̄)²] · E[Yi]
       = Σᵢ [(xi − x̄) / Σⱼ(xj − x̄)²] · (β0 + β1xi)
       = β0 · Σᵢ(xi − x̄) / Σⱼ(xj − x̄)²  +  β1 · Σᵢ(xi − x̄)xi / Σⱼ(xj − x̄)²
       = 0 + β1 · Σᵢ(xi − x̄)² / Σⱼ(xj − x̄)²
       = β1,

using Σᵢ(xi − x̄) = 0 and Σᵢ(xi − x̄)xi = Σᵢ(xi − x̄)².

For the intercept, note first that E[Ȳ] = (1/n)·Σᵢ E[Yi] = (1/n)·Σᵢ(β0 + β1xi) = β0 + β1x̄, which is part (a). Then

E[β̂0] = E[Ȳ − β̂1x̄]
       = E[Ȳ] − E[β̂1]·x̄
       = (β0 + β1x̄) − β1x̄
       = β0.

Proposition: The variances of β̂0 and β̂1 are

V(β̂0) = σ²·Σᵢxi² / (n·Σᵢ(xi − x̄)²) = σ²·Σᵢxi² / (nSxx)

and

V(β̂1) = σ² / Σᵢ(xi − x̄)² = σ² / Sxx,

where Sxx = Σᵢ(xi − x̄)².
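Both propositions can be checked numerically by Monte Carlo simulation. The sketch below is illustrative only: the parameter values (β0 = 1, β1 = 2, σ = 0.5) and design points are assumptions, not taken from the exercise. The sample means of the estimates should approach β0 and β1, and the sample variances should approach σ²·Σxi²/(nSxx) and σ²/Sxx.

```python
import random

# Monte Carlo check of unbiasedness and the variance formulas.
# Parameter values below are illustrative assumptions.
random.seed(0)
beta0, beta1, sigma = 1.0, 2.0, 0.5
x = [0.0, 1.0, 2.0, 3.0, 4.0]
n = len(x)
xbar = sum(x) / n
sxx = sum((xi - xbar) ** 2 for xi in x)          # Sxx = 10 for this design

b0s, b1s = [], []
for _ in range(20000):
    # Simulate Yi = beta0 + beta1*xi + eps_i with eps_i ~ N(0, sigma^2).
    y = [beta0 + beta1 * xi + random.gauss(0, sigma) for xi in x]
    ybar = sum(y) / n
    b1 = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y)) / sxx
    b0 = ybar - b1 * xbar
    b0s.append(b0)
    b1s.append(b1)

def mean(v):
    return sum(v) / len(v)

def var(v):
    m = mean(v)
    return sum((u - m) ** 2 for u in v) / len(v)

print(mean(b0s), mean(b1s))   # should be close to beta0 = 1 and beta1 = 2
print(var(b1s), sigma**2 / sxx)                                   # ~ sigma^2 / Sxx
print(var(b0s), sigma**2 * sum(xi**2 for xi in x) / (n * sxx))    # ~ sigma^2 * sum(xi^2) / (n*Sxx)
```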
Proof:

V(β̂1) = V( Σᵢ(xi − x̄)Yi / Sxx )
       = (1/Sxx²) · Σᵢ(xi − x̄)² · V(Yi)
       = (1/Sxx²) · (Σᵢ(xi − x̄)²) · σ²
       = σ² / Sxx.

V(β̂0) = V(Ȳ − β̂1x̄)
       = V(Ȳ) + V(−β̂1x̄) + 2·Cov(Ȳ, −x̄β̂1)
       = V(Ȳ) + x̄²·V(β̂1) − 2x̄·Cov(Ȳ, β̂1)
       = σ²/n + x̄²·(σ²/Sxx) − 2x̄·Cov(Ȳ, β̂1)
Now let's evaluate the covariance term:

Cov(Ȳ, β̂1) = Cov( Σᵢ (1/n)·Yi ,  Σⱼ [(xj − x̄)/Sxx]·Yj )
            = Σᵢ Σⱼ [(xj − x̄)/(nSxx)] · Cov(Yi, Yj)
            = Σᵢ [(xi − x̄)/(nSxx)] · σ² + 0     (the Yi are independent, so Cov(Yi, Yj) = 0 for i ≠ j)
            = 0,

since Σᵢ(xi − x̄) = 0.
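The claim Cov(Ȳ, β̂1) = 0 can also be seen empirically. The following sketch estimates the covariance by simulation under assumed parameter values (β0 = 1, β1 = 2, σ = 1); the sample covariance should be very close to zero.

```python
import random

# Empirical check that Cov(Ybar, beta1-hat) = 0. Parameter values are
# illustrative assumptions, not from the exercise.
random.seed(1)
beta0, beta1, sigma = 1.0, 2.0, 1.0
x = [0.0, 1.0, 2.0, 3.0, 4.0]
n = len(x)
xbar = sum(x) / n
sxx = sum((xi - xbar) ** 2 for xi in x)

ybars, b1s = [], []
for _ in range(20000):
    y = [beta0 + beta1 * xi + random.gauss(0, sigma) for xi in x]
    ybars.append(sum(y) / n)
    # beta1-hat = sum (xi - xbar) Yi / Sxx  (the Ybar term drops out)
    b1s.append(sum((xi - xbar) * yi for xi, yi in zip(x, y)) / sxx)

m_y = sum(ybars) / len(ybars)
m_b = sum(b1s) / len(b1s)
cov = sum((a - m_y) * (b - m_b) for a, b in zip(ybars, b1s)) / len(ybars)
print(cov)  # close to 0
```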
Thus

V(β̂0) = σ²/n + x̄²·(σ²/Sxx)
       = σ² · (Sxx + nx̄²) / (nSxx)
       = σ² · (Σᵢ(xi − x̄)² + nx̄²) / (nSxx)
       = σ² · (Σᵢ(xi² − 2x̄xi + x̄²) + nx̄²) / (nSxx)
       = σ² · Σᵢxi² / (nSxx)
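The final algebraic step rests on the identity Sxx + nx̄² = Σᵢxi², i.e. σ²/n + x̄²σ²/Sxx = σ²·Σᵢxi²/(nSxx). A short deterministic check of that identity, with arbitrary illustrative data:

```python
# Deterministic check of the identity: for any data x and any sigma^2,
#   sigma^2/n + xbar^2 * sigma^2/Sxx  ==  sigma^2 * sum(xi^2) / (n * Sxx)
sigma2 = 2.0                      # illustrative value
x = [1.3, 2.7, 3.1, 5.9, 8.2]     # arbitrary illustrative data
n = len(x)
xbar = sum(x) / n
sxx = sum((xi - xbar) ** 2 for xi in x)

lhs = sigma2 / n + xbar ** 2 * sigma2 / sxx
rhs = sigma2 * sum(xi ** 2 for xi in x) / (n * sxx)
print(abs(lhs - rhs) < 1e-9)  # True
```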

