If I ran a multiple regression analysis of dependent variable A on independent variables X and Y and got an adjusted R^2 of .0553, then added the independent variable Z and got an adjusted R^2 of .0550, would that decrease in adjusted R^2 mean that Z is not a strong predictor of A? And if Z were a strong predictor of A, would the adjusted R^2 increase?
Yes, this is how adjusted R^2 behaves. When you add a new independent variable to the model, the adjusted R^2 will dip if the new variable does not improve the existing model by more than would be expected by chance.
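For reference, the usual definition shows where that penalty comes from (n is the number of observations, p the number of predictors):

$$\bar{R}^2 = 1 - (1 - R^2)\,\frac{n - 1}{n - p - 1}$$

Each added predictor increases p, which inflates the factor multiplying (1 − R^2), so R^2 has to rise by enough to offset it or the adjusted value falls.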
It is possible to add a totally unrelated variable that still nudges the raw R^2 upward out of pure luck; in fact, ordinary R^2 can never decrease when a predictor is added. The adjusted R^2 is designed to rule out this chance happening: the new variable must overcome that 'chance improvement', and if it does not, the adjusted R^2 shows a dip.
If the new variable really is a strong (linear) predictor of A, the adjusted R^2 will increase, because the gain in R^2 far outweighs the penalty for the extra parameter. The sketch below illustrates both cases.
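Here is a minimal sketch in Python using statsmodels. The data are simulated purely for illustration: A depends only weakly on X and Y (to mimic the small adjusted R^2 in the question), and the two candidate Z variables, the effect sizes, and the sample size are all invented assumptions.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 200

# Hypothetical data: A depends only weakly on X and Y, so the baseline
# adjusted R^2 is small, roughly as in the question.
X = rng.normal(size=n)
Y = rng.normal(size=n)
A = 0.2 * X + 0.2 * Y + rng.normal(size=n)

def adj_r2(y, predictors):
    """Fit an OLS model with an intercept and return its adjusted R^2."""
    design = sm.add_constant(np.column_stack(predictors))
    return sm.OLS(y, design).fit().rsquared_adj

base = adj_r2(A, [X, Y])

# Case 1: Z is pure noise. It may raise the raw R^2 slightly by luck,
# but usually not enough to beat the penalty, so adjusted R^2 tends to dip.
Z_noise = rng.normal(size=n)
with_noise = adj_r2(A, [X, Y, Z_noise])

# Case 2: Z is strongly related to A by construction, so adjusted R^2
# rises clearly despite the extra parameter.
Z_strong = A + rng.normal(size=n) * 0.5
with_strong = adj_r2(A, [X, Y, Z_strong])

print(f"X, Y only:   adj R^2 = {base:.4f}")
print(f"+ noise Z:   adj R^2 = {with_noise:.4f}")
print(f"+ strong Z:  adj R^2 = {with_strong:.4f}")
```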
It can also happen that a variable is a good predictor of A on its own, yet adding it to the model lowers the adjusted R^2. This occurs when the new variable is largely redundant with the predictors already in the model: they explain the same part of A, so the new variable contributes too little unique information to offset the penalty for the extra parameter.
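Continuing the simulated script above (same variables and adj_r2 helper), a hypothetical Z that is nearly a copy of X behaves exactly this way: decent on its own, but worth little once X is already in the model.

```python
# Z_redundant is almost a duplicate of X, so by itself it predicts A about
# as well as X does, but alongside X and Y it adds almost no new information,
# and the penalty for the extra parameter can outweigh the tiny gain.
Z_redundant = X + rng.normal(size=n) * 0.1

alone = adj_r2(A, [Z_redundant])            # decent predictor by itself
combined = adj_r2(A, [X, Y, Z_redundant])   # little or no gain over X and Y

print(f"redundant Z alone:   adj R^2 = {alone:.4f}")
print(f"X, Y + redundant Z:  adj R^2 = {combined:.4f}")
```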