Scikit-learn Ridge Regression with Unregularized Intercept

Does scikit-learn's ridge regression include the intercept coefficient in the regularization term, and if so, is there a way to run ridge regression without penalizing the intercept?

Suppose I fit a ridge regression:

from sklearn import linear_model

mymodel = linear_model.Ridge(alpha=0.1, fit_intercept=True).fit(X, y)
print(mymodel.coef_)
print(mymodel.intercept_)


for some data X, y, where X does not contain a column of ones. fit_intercept=True automatically adds an intercept column, and the corresponding coefficient is reported as mymodel.intercept_. I can't figure out whether this intercept coefficient is part of the regularization sum in the optimization objective.

According to http://scikit-learn.org/stable/modules/linear_model.html the optimization goal is to minimize with respect to w:

|| X * w - y || ** 2 + alpha * || w || ** 2

(using the L2 norm). The second term is the regularization term, and the question is whether it includes the intercept coefficient when we set fit_intercept=True; and if so, how to disable that.
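One way to see the difference empirically is to compare the built-in intercept handling against manually appending a column of ones with fit_intercept=False, which forces the intercept into w and hence into the penalty. This is a sketch using made-up toy data (not from the original question):

```python
import numpy as np
from sklearn import linear_model

# Hypothetical toy data for illustration: a line with a large offset
X = np.array([-1.0, 0.0, 1.0]).reshape(3, 1)
y = np.array([1001.0, 1002.0, 1003.0])

# Built-in intercept handling
m1 = linear_model.Ridge(alpha=0.1, fit_intercept=True).fit(X, y)

# Manually appending a column of ones and disabling fit_intercept
# puts the "intercept" inside w, so it gets regularized too
X1 = np.hstack([X, np.ones((3, 1))])
m2 = linear_model.Ridge(alpha=0.1, fit_intercept=False).fit(X1, y)

print(m1.intercept_)  # ~1002.0: not shrunk by the penalty
print(m2.coef_[1])    # ~969.7: shrunk toward 0 by the penalty
```

If the intercept were part of the penalty in the first fit, both numbers would agree; the large gap shows they are treated differently.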


1 answer


The intercept is not penalized. Just try a simple three-point example with a large intercept.

from sklearn import linear_model
import numpy as np

x = np.array([-1, 0, 1]).reshape((3, 1))
y = np.array([1001, 1002, 1003])
fit = linear_model.Ridge(alpha=0.1, fit_intercept=True).fit(x, y)

print(fit.intercept_)
print(fit.coef_)




The intercept was set to its unpenalized (maximum-likelihood) value, 1002, while the slope was shrunk by the penalty (0.952 instead of 1).
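These numbers can be reproduced with the closed-form ridge solution on centered data, which is (as far as I understand) how scikit-learn keeps the intercept out of the penalty. A sketch:

```python
import numpy as np
from sklearn import linear_model

x = np.array([-1, 0, 1]).reshape((3, 1))
y = np.array([1001, 1002, 1003])
alpha = 0.1

fit = linear_model.Ridge(alpha=alpha, fit_intercept=True).fit(x, y)

# Closed-form ridge on centered data: w = (Xc'Xc + alpha*I)^-1 Xc'yc
xc = x - x.mean(axis=0)
yc = y - y.mean()
w = np.linalg.solve(xc.T @ xc + alpha * np.eye(1), xc.T @ yc)
# Intercept recovered afterwards from the means, with no penalty applied
intercept = y.mean() - x.mean(axis=0) @ w

print(fit.coef_, w)               # both ~0.95238 (= 2 / 2.1)
print(fit.intercept_, intercept)  # both 1002.0
```

Because the intercept is computed from the means after the penalized solve, it never enters the || w ||**2 term.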







