Florian Lalande

Reputation: 639

How to check the learning rate with train_on_batch [Keras]

I am using Keras on Python 2. Does anyone know how to check and modify the learning rate for the Adam optimizer? Here is my neural network, with my own optimizer defined. When training on batches with model.train_on_batch(...) I have no way to track the learning rate. Thanks for your help.

def CNN_model():
    # Create model
    model = Sequential()
    model.add(Conv2D(12, (5, 5), input_shape=(1, 256, 256), activation='elu'))
    model.add(MaxPooling2D(pool_size=(3, 3)))
    model.add(Conv2D(12, (5, 5), activation='elu'))
    model.add(MaxPooling2D(pool_size=(4, 4)))
    model.add(Conv2D(12, (3, 3), activation='elu'))
    model.add(MaxPooling2D(pool_size=(3, 3)))
    model.add(Flatten())
    model.add(Dropout(0.3))
    model.add(Dense(128, activation='elu'))
    model.add(Dropout(0.3))
    model.add(Dense(32, activation='elu'))
    model.add(Dense(2, activation='softmax'))
    # Compile model
    my_optimizer = Adam(lr=0.001, decay=0.05)
    model.compile(loss='categorical_crossentropy', optimizer=my_optimizer, metrics=['accuracy'])
    return model

Upvotes: 0

Views: 2234

Answers (2)

Ioannis Nasios

Reputation: 8527

You can use the ReduceLROnPlateau callback: add it to your callbacks list, then pass that list to your training call.

from keras.callbacks import ModelCheckpoint, ReduceLROnPlateau
callbacks = [ReduceLROnPlateau(monitor='val_acc',
                               patience=5,
                               verbose=1,
                               factor=0.5,
                               min_lr=0.00001)]
model = CNN_model()
model.fit(x_train, y_train, batch_size=batch_size,
          epochs=epochs,
          validation_data=(x_valid, y_valid),
          callbacks=callbacks)

Upvotes: 0

pitfall

Reputation: 2621

You can do this in several ways. The simplest, in my mind, is through a callback:

from keras.callbacks import Callback
from keras import backend as K

class showLR(Callback):
    def on_epoch_begin(self, epoch, logs=None):
        # Read the current value of the optimizer's learning-rate variable
        lr = float(K.get_value(self.model.optimizer.lr))
        print("epoch={:02d}, lr={:.5f}".format(epoch, lr))
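One caveat with reading `self.model.optimizer.lr` this way: the question's optimizer sets `decay=0.05`, and Keras 2 applies time-based decay on the fly at each update, so the `lr` variable keeps holding the base rate. As I understand the Keras 2 source, the rate actually applied follows lr * 1 / (1 + decay * iterations). A plain-Python sketch of that formula (the function name is mine, not a Keras API):

```python
def effective_lr(base_lr, decay, iterations):
    # Mirrors Keras 2's time-based decay: lr_t = lr * (1 / (1 + decay * iterations))
    return base_lr * (1.0 / (1.0 + decay * iterations))

# With the question's settings (lr=0.001, decay=0.05):
print(effective_lr(0.001, 0.05, 0))    # 0.001 at the very first update
print(effective_lr(0.001, 0.05, 100))  # base rate divided by 6 after 100 batches
```

So if you train with `model.train_on_batch(...)`, `iterations` advances once per batch, and the printed `lr` above will stay at 0.001 even though the applied rate shrinks.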

Upvotes: 3
