Question

In: Economics


Suppose you estimate the following regression model using OLS: Yi = β0 + β1Xi + β2Xi² + β3Xi³ + ui. You estimate that the p-value of the F-test that β2 = β3 = 0 is 0.01. This implies:

options:

(a) You can reject the null hypothesis that the regression function is linear.

(b) You cannot reject the null hypothesis that the regression function is either quadratic or cubic.

(c) The alternate hypothesis is that the regression function is either quadratic or cubic.

(d) Both (a) and (c).

Solutions

Expert Solution

Answer: (d), i.e., both (a) and (c)

The null hypothesis here is H0: β2 = β3 = 0. This is essentially the hypothesis that the regression function is linear (that is, the only nonzero slope coefficient is the one on the Xi term, not the ones on the Xi² or Xi³ terms).

A p-value of 0.01 means that the result is significant at the 1% level of significance. The question does not say what significance level is required, but 1% is generally sufficient to reject the null hypothesis (in the social sciences at least).

So, given this p-value and the null hypothesis, you can reject the null hypothesis that the regression function is linear, which is exactly what option (a) states.

Next, if the null hypothesis is that the regression function is linear, then the alternative hypothesis must be of the following form: H1: β2 ≠ 0 and/or β3 ≠ 0 (at least one of the two coefficients is different from zero).

This hypothesis says that either the slope of the quadratic term is different from 0 (in which case the regression function is quadratic), or the slope of the cubic term is different from 0 (in which case it is cubic), or both are different from 0 (in which case it is again cubic). So the alternative hypothesis says that the regression function is either quadratic or cubic, which is exactly what option (c) states.
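For readers who want to see the mechanics, here is a minimal sketch (not part of the original answer) of how such a joint F-test can be run in Python with statsmodels. The data are simulated purely for illustration and the variable names are my own; the only point is how the restriction β2 = β3 = 0 is tested jointly.

```python
# Minimal sketch: joint F-test of beta2 = beta3 = 0 in a cubic regression.
# The data below are simulated for illustration only.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
x = rng.uniform(-2, 2, size=200)
y = 1.0 + 0.5 * x + 0.3 * x**2 - 0.2 * x**3 + rng.normal(size=200)

# Design matrix with columns (constant, X, X^2, X^3)
X = sm.add_constant(np.column_stack([x, x**2, x**3]))
fit = sm.OLS(y, X).fit()

# Each row of R restricts one coefficient to zero:
# row 1 -> beta2 (on X^2), row 2 -> beta3 (on X^3)
R = np.array([[0, 0, 1, 0],
              [0, 0, 0, 1]])
print(fit.f_test(R))  # reject linearity if the reported p-value is below the chosen level
```

If the printed p-value came out at 0.01, the conclusion would be the one described above: reject linearity in favour of a quadratic or cubic specification.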


Related Solutions

Suppose you estimate a simple linear regression Yi = β0 + β1Xi + ei. Next suppose you estimate a regression only going through the origin, Yi = β̃1Xi + ui. Which regression will give a smaller SSR (sum of squared residuals)? Why?
1. Consider the model Ci = β0 + β1Yi + ui. Suppose you run this regression using OLS and get the following results: b0 = -3.13437; SE(b0) = 0.959254; b1 = 1.46693; SE(b1) = 21.0213; R-squared = 0.130357; and SER = 8.769363. Note that b0 and b1 are the OLS estimates of β0 and β1, respectively. The total number of observations is 2950. According to these results the relationship between C and Y is: A. no relationship B. impossible to tell C. positive D. negative 2. Consider the model Ci = β0 + β1Yi + ui. Suppose you run this...
Consider the model Ci = β0 + β1Yi + ui. Suppose you run this regression using OLS and get the following results: b0 = -3.13437; SE(b0) = 0.959254; b1 = 1.46693; SE(b1) = 0.0697828; R-squared = 0.130357; and SER = 8.769363. Note that b0 and b1 are the OLS estimates of β0 and β1, respectively. The total number of observations is 2950. The following values are relevant for assessing goodness of fit of the estimated model with the exception of A. 0.130357 B. 8.769363 C. 1.46693 D. none of these
1. Consider the model Ci = β0 + β1Yi + ui. Suppose you run this regression using OLS and get the following results: b0 = -3.13437; SE(b0) = 0.959254; b1 = 1.46693; SE(b1) = 0.0697828; R-squared = 0.130357; and SER = 8.769363. Note that b0 and b1 are the OLS estimates of β0 and β1, respectively. The total number of observations is 2950. The number of degrees of freedom for this regression is A. 2952 B. 2948 C. 2 D. 2950 2. Consider the model Ci = β0 + β1Yi + ui. Suppose you run this regression using OLS...
In a simple linear regression model yi = β0 + β1xi + εi with the usual assumptions, show algebraically that the least squares estimator β̂0 = b0 of the intercept has mean β0 and variance σ²[(1/n) + x̄²/Sxx].
Consider a regression model Yi = β0 + β1Xi + ui and suppose from a sample of 10 observations you are provided the following information: ∑(i=1 to 10) Yi = 71; ∑(i=1 to 10) Xi = 42; ∑(i=1 to 10) XiYi = 308; ∑(i=1 to 10) Xi² = 196. Given this information, what is the predicted value of Y, i.e., Ŷ, for x = 12? 1. 14 2. 11 3. 13 4. 12 5. 15 (A worked sketch of this calculation appears after this list.)
Consider the simple regression model: Yi = β0 + β1Xi + e. (a) Explain how the Ordinary Least Squares (OLS) estimator formulas for β0 and β1 are derived. (b) Under the Classical Linear Regression Model assumptions, the ordinary least squares (OLS) estimators are the “Best Linear Unbiased Estimators (B.L.U.E.).” Explain. (c) Other things equal, the standard error of β̂1 will decline as the sample size increases. Explain the importance of this.
Consider the simple regression model: Yi = β0 + β1Xi + ei. (a) Explain how the Ordinary Least Squares (OLS) estimator formulas for β̂0 and β̂1 are derived. (b) Under the Classical Linear Regression Model assumptions, the ordinary least squares (OLS) estimators are the “Best Linear Unbiased Estimators (B.L.U.E.).” Explain. (c) Other things equal, the standard error of β̂1 will decline as the sample size increases. Explain the importance of this.
Question 1: Consider the simple regression model: Yi = β0 + β1Xi + ei. (a) Explain how the Ordinary Least Squares (OLS) estimator formulas for β̂0 and β̂1 are derived. (b) Under the Classical Linear Regression Model assumptions, the ordinary least squares (OLS) estimators are the “Best Linear Unbiased Estimators (B.L.U.E.).” Explain. (c) Other things equal, the standard error of β̂1 will decline as the sample size increases. Explain the importance of this. Question 2: Consider the following...
Consider the simple linear regression: Yi = β0 + β1Xi + ui, where Yi and Xi are random variables, β0 and β1 are population intercept and slope parameters, respectively, and ui is the error term. Suppose the estimated regression equation is given by: Ŷi = β̂0 + β̂1Xi, where β̂0 and β̂1 are OLS estimates for β0 and β1. Define residuals ûi as: ûi = Yi − Ŷi. Show that: (a) (2 pts.) ∑(i=1 to n)...
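For the prediction question above (the one giving sample sums over 10 observations), here is a small worked sketch using only the standard OLS formulas for the slope and intercept. The numbers plugged in are the sums stated in that question; the snippet is illustrative and not the source's own solution.

```python
# Sketch: OLS slope/intercept from summary statistics, then predict at x = 12.
n = 10
sum_y, sum_x, sum_xy, sum_x2 = 71, 42, 308, 196

# beta1_hat = (n*Sum(XY) - Sum(X)*Sum(Y)) / (n*Sum(X^2) - Sum(X)^2)
beta1_hat = (n * sum_xy - sum_x * sum_y) / (n * sum_x2 - sum_x ** 2)  # 98 / 196 = 0.5
# beta0_hat = Ybar - beta1_hat * Xbar
beta0_hat = sum_y / n - beta1_hat * sum_x / n                         # 7.1 - 0.5 * 4.2 = 5.0
y_hat = beta0_hat + beta1_hat * 12                                    # 5.0 + 6.0 = 11.0
print(beta1_hat, beta0_hat, y_hat)
```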