Shiva Prakash

Reputation: 1919

KerasTuner Custom Objective Function

I am trying to use Keras Tuner for hyperparameter tuning, and I would like to maximize AUC. Can anyone help me with using kerastuner.Objective for a custom metric?

EXECUTIONS_PER_TRIAL = 5

b_tuner = BayesianOptimization(
    tune_nn_model,
    objective='val_binary_accuracy',
    max_trials=MAX_TRIALS,
    executions_per_trial=EXECUTIONS_PER_TRIAL,
    directory='test_dir101897',
    project_name='b_tune_nn',
    seed=12347
)

I tried defining a custom metric function like:

import tensorflow as tf
from keras import backend as K

def auc(y_true, y_pred):
    auc = tf.metrics.auc(y_true, y_pred)[1]
    K.get_session().run(tf.local_variables_initializer())
    return auc

and plugging it in as:

objective='val_auc'

But this does not work.
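
For reference, the pattern the answer below settles on is to register AUC as a compiled Keras metric (so that a val_auc entry appears in the training logs) and to pass kt.Objective('val_auc', direction='max') to the tuner, since KerasTuner cannot infer the optimization direction for a non-built-in metric name. A minimal sketch along those lines, where the body of tune_nn_model is only a placeholder and not the model from the question:

import tensorflow as tf
import keras_tuner as kt

def tune_nn_model(hp):
    # placeholder build function; the real one comes from the question
    model = tf.keras.Sequential([
        tf.keras.layers.Dense(hp.Int('units', min_value=32, max_value=256, step=32),
                              activation='relu'),
        tf.keras.layers.Dense(1, activation='sigmoid'),
    ])
    model.compile(
        optimizer='adam',
        loss='binary_crossentropy',
        # the named AUC metric is what makes 'val_auc' show up in the logs
        metrics=[tf.keras.metrics.AUC(name='auc')],
    )
    return model

b_tuner = kt.BayesianOptimization(
    tune_nn_model,
    # direction must be given explicitly for a metric KerasTuner does not know
    objective=kt.Objective('val_auc', direction='max'),
    max_trials=MAX_TRIALS,
    executions_per_trial=EXECUTIONS_PER_TRIAL,
    directory='test_dir101897',
    project_name='b_tune_nn',
    seed=12347,
)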

Upvotes: 8

Views: 8227

Answers (1)

Ramin Zandvakili

Reputation: 179

Thanks to the GitHub page provided above by @Shiva, I tried this to get the AUC for the validation data with the Keras Tuner, and it worked. My model is an LSTM, and I made the MyHyperModel class so that the batch_size can be tuned as well, as described here. You don't have to do this if you want to use a fixed batch_size. You can uncomment any of the other metrics and tune against them in the same way.

import tensorflow as tf
import keras_tuner as kt

# make X_train, y_train, X_valid, y_valid
mask_value = -9999.99
epochs = 200

class MyHyperModel(kt.HyperModel):
  def build(self, hp):
    # hyperparameters to search over
    hp_lstm_units = hp.Int('units', min_value=16, max_value=128, step=16)
    hp_dropout_rate = hp.Float('drop_out_rate', min_value=0, max_value=0.6)
    hp_recurrent_dropout_rate = hp.Float('recurrent_dropout_rate', min_value=0, max_value=0.6)
    hp_initial_learning_rate = hp.Float('initial_learning_rate', min_value=1e-3, max_value=1e-1, sampling='log')
    hp_decay = hp.Int('decay', min_value=10, max_value=100, step=10)

    # model
    model = tf.keras.Sequential()
    model.add(tf.keras.layers.Masking(mask_value=mask_value,
                                      input_shape=(X_train.shape[1], X_train.shape[2])))
    model.add(tf.keras.layers.LSTM(hp_lstm_units,
                                   dropout=hp_dropout_rate,
                                   recurrent_dropout=hp_recurrent_dropout_rate))
    model.add(tf.keras.layers.Dense(1, activation='sigmoid'))
    model.compile(loss=tf.keras.losses.BinaryCrossentropy(from_logits=False),
                  optimizer=tf.keras.optimizers.SGD(learning_rate=hp_initial_learning_rate, decay=hp_decay),
                  metrics=[
                      # tf.keras.metrics.TruePositives(name='tp'),
                      # tf.keras.metrics.FalsePositives(name='fp'),
                      # tf.keras.metrics.TrueNegatives(name='tn'),
                      # tf.keras.metrics.FalseNegatives(name='fn'),
                      # tf.keras.metrics.BinaryAccuracy(name='accuracy'),
                      # tf.keras.metrics.Precision(name='precision'),
                      # tf.keras.metrics.Recall(name='recall'),
                      tf.keras.metrics.AUC(name='auc'),
                  ])
    return model

  def fit(self, hp, model, *args, **kwargs):
    # fit() must be defined at class level (not inside build) so the tuner calls it;
    # overriding it lets batch_size be tuned as a hyperparameter
    hp_batch_size = hp.Int('batch_size', min_value=8, max_value=128, step=8)
    return model.fit(*args, batch_size=hp_batch_size, **kwargs)
      

tuner = kt.BayesianOptimization(
    MyHyperModel(),
    objective=kt.Objective('val_auc', direction='max'),
    overwrite=True,
    max_trials=100,
    directory="MyDirectory",
    project_name="MyProject",
)

tuner.search(X_train, y_train, epochs=200, validation_data=(X_valid, y_valid))
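
Once the search is done, the winning configuration can be pulled back out with the standard KerasTuner calls. A short follow-up sketch, not part of the original answer:

best_hps = tuner.get_best_hyperparameters(num_trials=1)[0]
print(best_hps.values)  # includes 'units', 'drop_out_rate', ..., 'batch_size'

# rebuild and refit the best model, reusing the overridden fit() so the
# tuned batch_size is applied
best_model = tuner.hypermodel.build(best_hps)
tuner.hypermodel.fit(best_hps, best_model,
                     X_train, y_train,
                     epochs=epochs,
                     validation_data=(X_valid, y_valid))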

Upvotes: 2
