Jalagandeswaran r

Reputation: 17

How to run GridsearchCV with Ridge regression in sklearn

I am importing GridsearchCV from sklearn to do this. I don't know what values I should give in the array for the parameters:

Parameters={'alpha':[array]}
Ridge_reg=GridsearchCV (ridge,parameters,scoring='neg mean squared error',cv=5)
  1. Is this correct?
  2. How to see the ridge regression graph?

Upvotes: 1

Views: 17409

Answers (1)

seralouk

Reputation: 33127

The code that you posted has multiple syntax errors, e.g. GridsearchCV (the class is GridSearchCV) and scoring='neg mean squared error' (the scorer name is 'neg_mean_squared_error').

The first input argument should be an estimator object (a model instance), not a class.

Use this:

from sklearn.linear_model import Ridge
import numpy as np
from sklearn.model_selection import GridSearchCV

n_samples, n_features = 10, 5
rng = np.random.RandomState(0)
y = rng.randn(n_samples)
X = rng.randn(n_samples, n_features)

parameters = {'alpha':[1, 10]}

# define the model/estimator
model = Ridge()

# define the grid search
Ridge_reg = GridSearchCV(model, parameters, scoring='neg_mean_squared_error', cv=5)

# fit the grid search
Ridge_reg.fit(X, y)

# best estimator
print(Ridge_reg.best_estimator_)

# best model
best_model = Ridge_reg.best_estimator_
best_model.fit(X, y)
...
...
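After fitting, you can also inspect which alpha the search chose and the cross-validated scores of each candidate. A minimal sketch, reusing the same toy data as above (which alpha wins depends on the random data, so no particular result is guaranteed):

```python
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import GridSearchCV

n_samples, n_features = 10, 5
rng = np.random.RandomState(0)
y = rng.randn(n_samples)
X = rng.randn(n_samples, n_features)

Ridge_reg = GridSearchCV(Ridge(), {'alpha': [1, 10]},
                         scoring='neg_mean_squared_error', cv=5)
Ridge_reg.fit(X, y)

# the winning hyperparameter, e.g. {'alpha': ...}
print(Ridge_reg.best_params_)

# mean cross-validated score of the best candidate (negative MSE, so closer to 0 is better)
print(Ridge_reg.best_score_)

# mean test score for every candidate in the grid, in grid order
print(Ridge_reg.cv_results_['mean_test_score'])
```

Note that the scorer is negated MSE, so GridSearchCV can always maximize; the raw MSE is just the score with the sign flipped.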

For the visualization (Ridge coefficients as a function of the regularization):

import matplotlib.pyplot as plt

alphas = [1, 10]
coefs = []
for a in alphas:
    ridge = Ridge(alpha=a, fit_intercept=False)
    ridge.fit(X, y)
    coefs.append(ridge.coef_)

ax = plt.gca()
ax.plot(alphas, coefs)
ax.set_xscale('log')
ax.set_xlim(ax.get_xlim()[::-1])  # reverse axis
plt.xlabel('alpha')
plt.ylabel('weights')
plt.title('Ridge coefficients as a function of the regularization')
plt.axis('tight')
plt.show()
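With only two alphas the plot is just a straight line segment per coefficient. For a smoother curve you can sweep many alphas with np.logspace, in the spirit of scikit-learn's ridge-path example; a sketch (the alpha range 1e-2 to 1e4 is an arbitrary choice for illustration):

```python
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.RandomState(0)
X = rng.randn(10, 5)
y = rng.randn(10)

# 50 alphas spaced logarithmically between 1e-2 and 1e4
alphas = np.logspace(-2, 4, 50)
coefs = []
for a in alphas:
    ridge = Ridge(alpha=a, fit_intercept=False)
    ridge.fit(X, y)
    coefs.append(ridge.coef_)

# one row of weights per alpha; as alpha grows, the weights shrink toward zero
coefs = np.array(coefs)
```

Plotting `alphas` against `coefs` exactly as in the snippet above then shows each coefficient's path being driven toward zero as the regularization strength increases.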


Upvotes: 4
