Question

In: Statistics and Probability

When I ran a bivariate regression, I got the following table

Coefficients:
Estimate Std. Error z value Pr(>|z|)
(Intercept) 10.8681 2.8754 3.780 0.000157 ***
ETHWAR -1.0170 0.4524 -2.248 0.024570 *

When I ran a multivariate regression, I got the following table

Coefficients:
Estimate Std. Error z value Pr(>|z|)
(Intercept) 10.811 2.987 3.619 0.000296 ***
ETHWAR -13.804 4844.876 -0.003 0.997727
CIVTOT 12.730 4844.877 0.003 0.997903

Why did the p-value for ETHWAR change? And why did it change so dramatically?

Solutions

Expert Solution

Answer:

A small p-value (< 0.05 at the 5% level of significance) indicates that you can reject the null hypothesis that the coefficient is zero. In other words, a predictor with a low p-value is likely a meaningful addition to your model, because changes in that predictor's value are associated with changes in the response variable.

Conversely, a larger (non-significant) p-value suggests that changes in the predictor are not associated with changes in the response, once the other predictors in the model are accounted for.

In the bivariate regression, ETHWAR is significant because its p-value (0.024570) is less than 0.05.

In the multiple regression, however, neither ETHWAR nor CIVTOT is significant: both p-values (0.9977 and 0.9979) are far above 0.05 at the alpha level of 0.05.

This dramatic change is a classic symptom of multicollinearity: ETHWAR and CIVTOT are almost perfectly correlated with each other. You can see the evidence in the table itself. The standard error of ETHWAR exploded from 0.4524 to about 4845, and the two coefficients are large, nearly equal in magnitude, and opposite in sign (-13.804 and +12.730) with essentially identical standard errors. When two predictors carry nearly the same information, the model cannot separate their individual effects: the coefficient estimates become unstable and the individual z-tests lose their power, even though the two variables may still be jointly significant. Dropping one of the two collinear variables (or combining them into a single index) would restore a stable, interpretable estimate for the effect of ethnic war.
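The effect of collinearity on standard errors is easy to reproduce in a small simulation. The sketch below (plain Python, with hypothetical variable names; it is an illustration of the mechanism, not the original ETHWAR data) fits ordinary least squares by hand, first with one predictor, then after adding a second predictor that is almost an exact copy of the first. The standard error of the first coefficient balloons, just as it did in the table above.

```python
import math
import random

def ols(X, y):
    """Ordinary least squares via the normal equations.
    X: list of rows, each starting with a 1 for the intercept.
    Returns (coefficients, standard errors)."""
    n, k = len(X), len(X[0])
    # X'X and X'y
    xtx = [[sum(X[i][a] * X[i][b] for i in range(n)) for b in range(k)] for a in range(k)]
    xty = [sum(X[i][a] * y[i] for i in range(n)) for a in range(k)]
    # Invert X'X by Gauss-Jordan elimination (augment with the identity)
    aug = [row[:] + [1.0 if i == j else 0.0 for j in range(k)]
           for i, row in enumerate(xtx)]
    for col in range(k):
        piv = max(range(col, k), key=lambda r: abs(aug[r][col]))
        aug[col], aug[piv] = aug[piv], aug[col]
        p = aug[col][col]
        aug[col] = [v / p for v in aug[col]]
        for r in range(k):
            if r != col:
                f = aug[r][col]
                aug[r] = [v - f * w for v, w in zip(aug[r], aug[col])]
    inv = [row[k:] for row in aug]
    beta = [sum(inv[a][b] * xty[b] for b in range(k)) for a in range(k)]
    # Residual variance and SE(beta_j) = sqrt(sigma^2 * [(X'X)^-1]_jj)
    resid = [y[i] - sum(X[i][a] * beta[a] for a in range(k)) for i in range(n)]
    sigma2 = sum(e * e for e in resid) / (n - k)
    se = [math.sqrt(sigma2 * inv[a][a]) for a in range(k)]
    return beta, se

random.seed(1)
n = 200
x1 = [random.gauss(0, 1) for _ in range(n)]
x2 = [v + random.gauss(0, 0.01) for v in x1]   # x2 is x1 plus tiny noise: severe collinearity
y = [2.0 + 1.0 * x1[i] + random.gauss(0, 1) for i in range(n)]

_, se_single = ols([[1.0, x1[i]] for i in range(n)], y)
_, se_both = ols([[1.0, x1[i], x2[i]] for i in range(n)], y)

print(f"SE of x1 alone:         {se_single[1]:.3f}")
print(f"SE of x1 with x2 added: {se_both[1]:.3f}")
```

In runs like this the standard error of x1 is inflated by roughly a factor of 1/sqrt(1 - r^2), where r is the correlation between the two predictors; with r very close to 1, the individual tests become uninformative even though the fit as a whole is fine.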

