In: Statistics and Probability
This exercise will investigate some inferential questions about the intercept β0 from a simple linear model.
a. Show that E(Ȳ) = β0 + β1x̄.
b. Show that β̂0 is an unbiased estimator for β0.
c. Calculate the variance of β̂0. Hint: Recall that Ȳ and β̂1 are independent.
Simple Linear Regression
Model: Y = β0 + β1x + ε
• ε is the random error, so Y is a random variable too.
Sample:
(x1, Y1), (x2, Y2), . . . , (xn, Yn)
Each (xi, Yi) satisfies Yi = β0 + β1xi + εi.
Least Squares Estimators (all sums run over i = 1, . . . , n):

    β̂1 = Σ(xi − x̄)(Yi − Ȳ) / Σ(xi − x̄)²,   β̂0 = Ȳ − β̂1x̄
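A quick numerical sketch of these formulas (the data values below are made up for illustration, not taken from the text):

```python
# Least-squares estimates computed directly from the formulas above.
x = [1.0, 2.0, 3.0, 4.0, 5.0]      # made-up predictor values
Y = [2.1, 2.9, 4.2, 4.8, 6.1]      # made-up responses
n = len(x)
xbar = sum(x) / n
Ybar = sum(Y) / n
Sxx = sum((xi - xbar) ** 2 for xi in x)                      # sum of (xi - xbar)^2
Sxy = sum((xi - xbar) * (Yi - Ybar) for xi, Yi in zip(x, Y))
beta1_hat = Sxy / Sxx                # slope estimate
beta0_hat = Ybar - beta1_hat * xbar  # intercept estimate
print(beta1_hat, beta0_hat)
```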
Proposition: The estimators β̂0 and β̂1 are unbiased; that is,

    E[β̂0] = β0,   E[β̂1] = β1.
Proof:

    β̂1 = Σ(xi − x̄)(Yi − Ȳ) / Σ(xi − x̄)²
       = [Σ(xi − x̄)Yi − Ȳ Σ(xi − x̄)] / Σ(xi − x̄)²
       = Σ(xi − x̄)Yi / Σ(xi − x̄)²,

since Σ(xi − x̄) = 0.
Thus, writing Sxx = Σ(xi − x̄)²,

    E[β̂1] = Σ [(xi − x̄)/Sxx] E[Yi]
          = Σ [(xi − x̄)/Sxx] (β0 + β1xi)
          = β0 [Σ(xi − x̄)]/Sxx + β1 [Σ(xi − x̄)xi]/Sxx
          = 0 + β1 Sxx/Sxx
          = β1,

using Σ(xi − x̄) = 0 and Σ(xi − x̄)xi = Σ(xi − x̄)² = Sxx.
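The unbiasedness argument leans on two identities about deviations from the mean, Σ(xi − x̄) = 0 and Σ(xi − x̄)xi = Σ(xi − x̄)². A quick numeric check with arbitrary made-up x values:

```python
# Verify the two deviation identities used in the proof.
x = [1.5, 2.0, 3.5, 7.0, 9.0]      # arbitrary made-up values
n = len(x)
xbar = sum(x) / n
dev_sum = sum(xi - xbar for xi in x)        # sum of (xi - xbar), should be 0
Sxx = sum((xi - xbar) ** 2 for xi in x)     # sum of (xi - xbar)^2
cross = sum((xi - xbar) * xi for xi in x)   # sum of (xi - xbar)*xi, should equal Sxx
print(abs(dev_sum) < 1e-9, abs(cross - Sxx) < 1e-9)
```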
    E[β̂0] = E[Ȳ − β̂1x̄]
          = (1/n) Σ E[Yi] − x̄ E[β̂1]
          = (1/n) Σ (β0 + β1xi) − β1 (1/n) Σ xi
          = β0 + β1x̄ − β1x̄
          = β0.

Proposition: The variances of β̂0 and β̂1 are:
    V(β̂0) = σ² Σ xi² / (n Σ(xi − x̄)²) = σ² Σ xi² / (n Sxx)

and

    V(β̂1) = σ² / Σ(xi − x̄)² = σ² / Sxx.
Proof: Since the Yi are independent with V(Yi) = σ²,

    V(β̂1) = V( Σ(xi − x̄)Yi / Sxx )
          = (1/Sxx²) Σ(xi − x̄)² V(Yi)
          = (1/Sxx²) (Σ(xi − x̄)²) σ²
          = σ²/Sxx.

For the intercept,

    V(β̂0) = V(Ȳ − β̂1x̄)
          = V(Ȳ) + V(−β̂1x̄) + 2 Cov(Ȳ, −x̄β̂1)
          = V(Ȳ) + x̄² V(β̂1) − 2x̄ Cov(Ȳ, β̂1)
          = σ²/n + x̄² σ²/Sxx − 2x̄ Cov(Ȳ, β̂1).
Now let’s evaluate the covariance term. Since the Yi are independent, Cov(Yi, Yj) = 0 for i ≠ j and Cov(Yi, Yi) = σ²:

    Cov(Ȳ, β̂1) = Cov( Σi (1/n) Yi , Σj [(xj − x̄)/Sxx] Yj )
               = Σi Σj [(xj − x̄)/(nSxx)] Cov(Yi, Yj)
               = Σi [(xi − x̄)/(nSxx)] σ² + 0
               = 0,

since Σ(xi − x̄) = 0.
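A small simulation (standard library only; the design points and parameter values β0 = 2, β1 = 0.5, σ = 1 are made up for illustration) is consistent with Cov(Ȳ, β̂1) = 0, and its sample means also illustrate the unbiasedness result above:

```python
import random

# Monte Carlo sketch: simulate many samples from the model and estimate
# the covariance between Ybar and beta1_hat across simulations.
random.seed(0)
x = [1.0, 2.0, 3.0, 4.0, 5.0]          # made-up design points
n = len(x)
xbar = sum(x) / n
Sxx = sum((xi - xbar) ** 2 for xi in x)
beta0, beta1, sigma = 2.0, 0.5, 1.0    # made-up true parameters

ybars, b1s = [], []
for _ in range(20000):
    Y = [beta0 + beta1 * xi + random.gauss(0.0, sigma) for xi in x]
    ybars.append(sum(Y) / n)                                          # Ybar
    b1s.append(sum((xi - xbar) * Yi for xi, Yi in zip(x, Y)) / Sxx)   # beta1_hat

m1 = sum(ybars) / len(ybars)           # should be near beta0 + beta1*xbar = 3.5
m2 = sum(b1s) / len(b1s)               # should be near beta1 = 0.5
# Sample covariance between Ybar and beta1_hat across simulations
cov = sum((a - m1) * (b - m2) for a, b in zip(ybars, b1s)) / len(ybars)
print(abs(cov) < 0.01)
```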
Thus

    V(β̂0) = σ²/n + x̄² σ²/Sxx
          = σ² (Sxx + n x̄²) / (n Sxx)
          = σ² (Σ(xi − x̄)² + n x̄²) / (n Sxx)
          = σ² (Σ(xi² − 2x̄xi + x̄²) + n x̄²) / (n Sxx)
          = σ² Σ xi² / (n Sxx).
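The final simplification can be checked numerically: the two-term form σ²/n + x̄²σ²/Sxx should agree with the closed form σ² Σ xi²/(n Sxx) for any data. A sketch with made-up x and σ² values:

```python
# Check that the two expressions for V(beta0_hat) agree.
sigma2 = 2.5                            # made-up error variance
x = [0.5, 1.0, 2.0, 3.5, 6.0]           # made-up design points
n = len(x)
xbar = sum(x) / n
Sxx = sum((xi - xbar) ** 2 for xi in x)
lhs = sigma2 / n + xbar ** 2 * sigma2 / Sxx            # two-term form
rhs = sigma2 * sum(xi ** 2 for xi in x) / (n * Sxx)    # closed form
print(abs(lhs - rhs) < 1e-9)
```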