When removing an intercept, the adjusted R-squared should increase and the SSE should decrease. What if the adjusted R-squared and the SSE both increase? For example, if removing intercept A increases the adjusted R-squared and decreases the SSE, while removing intercept B increases both the adjusted R-squared and the SSE, is the first model (removing only intercept A) the better one?

# creating best linear regression model

Hi,

I'm no expert in statistics, and this might not be the best place to ask, but I found a nice post that explains in great detail the caveats of using methods like this in R, since the calculations done by the function change when you manually remove intercepts:

I don't know if this will answer your questions, but at least it's a starting point.

PJ

I'm afraid I do not agree with this claim. A model with an intercept is a bigger class than a model without one, since a model like Y = X\beta + \epsilon can always be considered a special case of the model Y = \beta_0 + X\beta + \epsilon with \beta_0 = `0`. Since the minimum over a set is always less than or equal to the minimum over any subset of that set, the SSE of the model with an intercept should be no larger than the SSE of the model without one.
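The nesting argument can be checked numerically. A minimal sketch in Python with numpy (the thread is about R's `lm()`, but the underlying least-squares algebra is the same; the data here are made up for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 50
x = rng.normal(size=n)
y = 2.0 + 3.0 * x + rng.normal(size=n)  # true intercept is nonzero

# Design matrix with an intercept (column of ones) and without one
X_with = np.column_stack([np.ones(n), x])
X_without = x.reshape(-1, 1)

def sse(X, y):
    """Residual sum of squares of the least-squares fit of y on X."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return float(resid @ resid)

sse_with = sse(X_with, y)
sse_without = sse(X_without, y)

# The intercept model nests the no-intercept model (beta_0 = 0),
# so its minimized SSE can never be larger.
print(sse_with <= sse_without)  # True
```

The comparison holds for any data, not just this seed: constraining \beta_0 to `0` can only shrink the set of candidate fits the least-squares minimization searches over.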

Also, I don't follow what you mean here. As far as I know, a model cannot have two intercepts, since that leads to non-identifiability. If you have two intercepts `A` and `B` in the model, the intercept will actually be `A + B`, and you cannot estimate `A` and `B` uniquely from the estimate of `A + B` alone.
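The identifiability point can be seen directly from the design matrix: two intercepts mean two identical columns of ones, so the matrix is rank-deficient and every split of `A + B` between `A` and `B` produces exactly the same fitted values. A small numpy illustration with hypothetical data:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 30
x = rng.normal(size=n)
y = 1.5 + 0.5 * x + rng.normal(size=n)

# Two "intercepts" = two identical columns of ones
X = np.column_stack([np.ones(n), np.ones(n), x])

# The design matrix has 3 columns but only rank 2: A and B are confounded
rank = np.linalg.matrix_rank(X)
print(rank)  # 2

# Any split of the total intercept between A and B fits equally well
beta1 = np.array([1.0, 0.5, 0.5])  # A = 1.0, B = 0.5
beta2 = np.array([0.0, 1.5, 0.5])  # A = 0.0, B = 1.5; same A + B
same_fit = np.allclose(X @ beta1, X @ beta2)
print(same_fit)  # True
```

This is why R's `lm()` would silently drop one of the duplicated columns (reporting `NA` for its coefficient) rather than report two separate intercept estimates.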

This topic was automatically closed 21 days after the last reply. New replies are no longer allowed.