Importance of R Square Value in Simple Linear Regression. What happens when you increase number of controlled variables? Is it good to increase the number of controlled variable? If yes, why and if not why not? Be specific on your discussion.
R-squared is a goodness-of-fit measure for linear regression
models. This statistic indicates the percentage of the variance in
the dependent variable that the independent variables explain
collectively. After fitting a linear regression model, you need to
determine how well the model fits the data.
R-squared evaluates the scatter of the data points around the
fitted regression line. It is also called the coefficient of
determination, or the coefficient of multiple determination for
multiple regression. For the same data set, higher R-squared values
represent smaller differences between the observed data and the
fitted values.
R-squared is the percentage of the variation in the dependent variable that the linear model explains, so it always lies between 0 and 100%.
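To make the definition concrete, here is a minimal sketch that fits a simple linear regression by the standard least-squares formulas and computes R-squared as 1 − SS_residual / SS_total. The data points are hypothetical, made up purely for illustration.

```python
# Hypothetical data: y is roughly 2*x plus a little noise.
x = [1, 2, 3, 4, 5]
y = [2.1, 3.9, 6.2, 7.8, 10.1]

n = len(x)
mx = sum(x) / n
my = sum(y) / n

# Least-squares slope and intercept for a simple linear regression.
sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
sxx = sum((xi - mx) ** 2 for xi in x)
b1 = sxy / sxx
b0 = my - b1 * mx

# R-squared = 1 - SS_residual / SS_total: the share of the variance
# in y around its mean that the fitted line accounts for.
ss_res = sum((yi - (b0 + b1 * xi)) ** 2 for xi, yi in zip(x, y))
ss_tot = sum((yi - my) ** 2 for yi in y)
r2 = 1 - ss_res / ss_tot
print(round(r2, 4))
```

Because these points lie almost exactly on a line, the R-squared comes out close to 1; scattering the points further from the line would drive it toward 0.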
When we increase the number of controlled variables, the R-squared value always increases, but this is not necessarily good: the increase happens even when the added variables contribute nothing meaningful to the model.
R-squared is a basic metric that tells you how much of the variance in the dependent variable the model explains. In multivariate linear regression, if you keep adding new variables, the R-squared value will always increase, regardless of whether those variables are significant. Adjusted R-squared corrects for this: it penalises the model for each additional predictor, so it increases only when a new variable improves the fit by more than would be expected by chance. Therefore, when doing multivariate linear regression, we should always look at adjusted R-squared instead of plain R-squared.
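The contrast above can be demonstrated numerically. The sketch below (with made-up data; the noise column is deliberately irrelevant) fits an ordinary least-squares model with and without an extra pure-noise predictor, then compares R-squared with adjusted R-squared, computed as 1 − (1 − R²)(n − 1)/(n − p − 1).

```python
import numpy as np

# Hypothetical data: y truly depends only on x1; "noise" is an
# irrelevant extra variable we will add to the model.
rng = np.random.default_rng(0)
n = 30
x1 = np.linspace(0, 10, n)
y = 3.0 * x1 + rng.normal(0, 2.0, n)
noise = rng.normal(0, 1.0, n)

def r2_and_adjusted(predictors, y):
    """Fit OLS with an intercept; return (R-squared, adjusted R-squared)."""
    X = np.column_stack([np.ones(len(y))] + list(predictors))
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    ss_res = resid @ resid
    ss_tot = ((y - y.mean()) ** 2).sum()
    r2 = 1 - ss_res / ss_tot
    p = X.shape[1] - 1                      # number of predictors
    adj = 1 - (1 - r2) * (len(y) - 1) / (len(y) - p - 1)
    return r2, adj

r2_1, adj_1 = r2_and_adjusted([x1], y)
r2_2, adj_2 = r2_and_adjusted([x1, noise], y)

# R-squared never decreases when a variable is added, even a useless one.
print(r2_2 >= r2_1)
```

Running this, the two-predictor model's R-squared is at least as high as the one-predictor model's, while the adjusted R-squared is pulled down by the penalty for the extra parameter, which is exactly why adjusted R-squared is the safer metric for comparing models with different numbers of variables.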