Reputation: 21
I'm trying to tune my model, but I'm getting this ValueError. I tried changing the activation function, but when I did, tuning the learning rate returned the same error. I'm not sure if I'm missing something.
> ValueError                                Traceback (most recent call last)
> <ipython-input-46-5d07e2ad456a> in <module>
>       9                          param_distributions = params,
>      10                          cv = KFold(10))
> ---> 11 random_search_results = random_search.fit(X_train, y_train)
>
> ValueError: activation is not a legal parameter
def create_model(learning_rate=0.01):
    opt = 'Adam'
    Tuning_model = Sequential()
    Tuning_model.add(Dense(16, input_shape=(X_train.shape[1],)))
    Tuning_model.add(Dropout(.2))
    Tuning_model.add(BatchNormalization())
    Tuning_model.add(Activation('relu'))
    Tuning_model.add(Dense(32))
    Tuning_model.add(Dropout(.2))
    Tuning_model.add(Dense(1))
    Tuning_model.compile(loss='mse', optimizer=opt, metrics='mse')
    return Tuning_model
# Define the hyperparameter space
params = {'activation': ["relu", "tanh"],
          'batch_size': [16, 32, 64, 128],
          'epochs': [50, 100],
          'optimizer': ["Adam", "SGD", "RMSprop"],
          'learning_rate': [0.01, 0.001, 0.0001]}

# Create a RandomizedSearchCV object
random_search = RandomizedSearchCV(Tuning_model,
                                   param_distributions = params,
                                   cv = KFold(10))
random_search_results = random_search.fit(X_train, y_train)
Upvotes: 2
Views: 1002
Reputation: 4893
The ValueError is raised because activation is not a parameter of the model as a whole, but of certain of its layers. So when RandomizedSearchCV tries to pass it, the Model object can't accept it.
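The mechanism can be demonstrated with any scikit-learn estimator (Ridge here is just an illustration, not part of the original code): the search tunes an estimator by calling set_params(**candidate), and only names listed in get_params() are accepted — anything else raises the same kind of ValueError.

```python
from sklearn.linear_model import Ridge

# RandomizedSearchCV sets each sampled candidate via estimator.set_params();
# names not exposed by get_params() are rejected with a ValueError.
est = Ridge()
print("alpha" in est.get_params())       # a real Ridge parameter -> tunable
try:
    est.set_params(activation="relu")    # not a Ridge parameter
except ValueError as e:
    print("rejected:", e)
```

The same check happens with a wrapped Keras model: only names the wrapper exposes as parameters are legal search keys.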
I suggest 2 solutions:

1. Wrap the build function in KerasClassifier (or rather KerasRegressor here, since the loss is MSE) and make activation one of its arguments. Then optimize its performance with RandomizedSearchCV.
2. Use optuna - it works smarter and offers a greater degree of customization. Try it - they have good and easy docs on their site.

Side note: RandomizedSearchCV with 10 folds is overkill; if the sample is large enough, use 2 folds or even a single train/validation split.
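As a sketch of the first option (assuming TensorFlow's Keras; `n_features` is a stand-in for `X_train.shape[1]`), the build function takes every searched name as an argument, so a scikit-learn wrapper can route them to it:

```python
from tensorflow import keras
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense, Dropout, BatchNormalization, Activation

def create_model(activation="relu", learning_rate=0.01,
                 optimizer="Adam", n_features=10):
    # Build the optimizer from its name so learning_rate is actually applied
    opt = keras.optimizers.get(
        {"class_name": optimizer, "config": {"learning_rate": learning_rate}})
    model = Sequential([
        Dense(16, input_shape=(n_features,)),
        Dropout(0.2),
        BatchNormalization(),
        Activation(activation),   # now a tunable argument
        Dense(32),
        Dropout(0.2),
        Dense(1),
    ])
    model.compile(loss="mse", optimizer=opt, metrics=["mse"])
    return model
```

How the search keys map to these arguments depends on the wrapper package: with the legacy built-in wrapper, keys matching the build function's arguments are passed through directly, while with the scikeras package (the maintained successor) they are prefixed, e.g. `model__activation` — check the docs of whichever you have installed.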
Upvotes: 4