Cosmic

Reputation: 123

I can't add an optimizer parameter in grid search

from keras.models import Sequential
from keras.layers import Dense
from keras.wrappers.scikit_learn import KerasClassifier
from sklearn.model_selection import GridSearchCV

def build_classifier():
    classifier = Sequential()
    classifier.add(Dense(units=6, init='uniform', activation='relu'))
    classifier.add(Dense(units=6, init='uniform', activation='relu'))
    classifier.add(Dense(units=1, init='uniform', activation='sigmoid'))
    classifier.compile(optimizer='adam', loss='binary_crossentropy',
                       metrics=['accuracy'])
    return classifier

KC = KerasClassifier(build_fn=build_classifier)
parameters = {'batch_size': [25, 32],
              'epochs': [100, 500],
              'optimizer': ['adam', 'rmsprop']}
grid_search = GridSearchCV(estimator=KC,
                           param_grid=parameters,
                           scoring='accuracy', cv=10)
grid_search.fit(X_train, y_train)

I want to test the model with different optimizers, but I can't seem to add the optimizer to the grid search. Whenever I run the program, it raises an error while fitting the training set.

ValueError: optimizer is not a legal parameter

Upvotes: 12

Views: 12863

Answers (3)

Vivek Kumar

Reputation: 36619

The Keras documentation for the scikit-learn wrappers says:

sk_params takes both model parameters and fitting parameters. Legal model parameters are the arguments of build_fn. Note that like all other estimators in scikit-learn, build_fn should provide default values for its arguments, so that you could create the estimator without passing any values to sk_params.

GridSearchCV calls get_params() on the KerasClassifier to get the list of valid parameters that can be passed to it, which, given your code:

KC = KerasClassifier(build_fn=build_classifier)

will be empty (since you are not specifying any parameters in the build_classifier).
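You can see this rule in miniature without Keras at all. The sketch below is a simplified illustration of the wrapper's behavior, not its actual implementation: it treats the named arguments of `build_fn` as the "legal" model parameters, which is why the no-argument `build_classifier` yields nothing grid-searchable (the stand-in function bodies here are hypothetical placeholders).

```python
import inspect

def build_classifier():
    # The question's version: no arguments, so no legal model parameters.
    pass

def build_classifier_fixed(optimizer='adam'):
    # The fixed version: 'optimizer' becomes a legal model parameter.
    pass

def legal_model_params(build_fn):
    # Simplified stand-in for the wrapper's check: legal model
    # parameters are the named arguments of build_fn.
    return set(inspect.signature(build_fn).parameters)

print(legal_model_params(build_classifier))                       # set()
print('optimizer' in legal_model_params(build_classifier_fixed))  # True
```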

Change that to something like:

# Use a parameter to specify the optimizer
def build_classifier(optimizer='adam'):
    ...
    classifier.compile(optimizer=optimizer, loss='binary_crossentropy',
                       metrics=['accuracy'])
    ...
    return classifier

After that it should work.

Upvotes: 18

Karuppaiya

Reputation: 36

# Function to create the model, required for KerasClassifier
def create_model(optimizer='adam'):
    model = Sequential()
    model.add(Dense(150, input_dim=13, activation='relu'))
    model.add(Dense(80, activation='relu'))
    model.add(Dense(2, activation='softmax'))
    # Pass the argument through to compile(), rather than hardcoding 'adam'
    model.compile(loss='categorical_crossentropy', optimizer=optimizer,
                  metrics=['accuracy'])
    return model
    

# create model
model = KerasClassifier(build_fn=create_model, verbose=0)
# define the grid search parameters
batch_size = [10, 20]
epochs = [10, 50]
optimizer = ['adam','rmsprop']
param_grid = dict(optimizer=optimizer,batch_size=batch_size, epochs=epochs)
grid = GridSearchCV(estimator=model, param_grid=param_grid, n_jobs=-1, cv=3)
grid_result = grid.fit(X, y)

Note that optimizer=optimizer goes in param_grid alongside batch_size=batch_size and epochs=epochs; the order of the keys in the dict does not matter to GridSearchCV.
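Whatever the key order, GridSearchCV evaluates every combination of the three lists. A stdlib-only sketch (using the example's values, no scikit-learn required) enumerates the 2 × 2 × 2 = 8 candidate settings:

```python
from itertools import product

batch_size = [10, 20]
epochs = [10, 50]
optimizer = ['adam', 'rmsprop']

# Every combination GridSearchCV would evaluate; key order is irrelevant.
combos = [dict(zip(('optimizer', 'batch_size', 'epochs'), values))
          for values in product(optimizer, batch_size, epochs)]
print(len(combos))  # 8
```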

Upvotes: 1

I think it will be solved if you add optimizer='adam' as an argument to your build_classifier and then pass optimizer=optimizer to compile:

from keras.wrappers.scikit_learn import KerasClassifier
from sklearn.model_selection import GridSearchCV

def build_classifier(optimizer='adam'):
    classifier = Sequential()
    classifier.add(Dense(units=6, init='uniform', activation='relu'))
    classifier.add(Dense(units=6, init='uniform', activation='relu'))
    classifier.add(Dense(units=1, init='uniform', activation='sigmoid'))
    classifier.compile(optimizer=optimizer, loss='binary_crossentropy',
                       metrics=['accuracy'])
    return classifier

KC = KerasClassifier(build_fn=build_classifier)
parameters = {'batch_size': [25, 32],
              'epochs': [100, 500],
              'optimizer': ['adam', 'rmsprop']}
grid_search = GridSearchCV(estimator=KC,
                           param_grid=parameters,
                           scoring='accuracy', cv=10)
grid_search.fit(X_train, y_train)

Upvotes: 1
