Noura

Reputation: 474

Increase epochs number when the learning rate decreases

I am using:

ReduceLROnPlateau(monitor='val_loss', factor=0.1, patience=50)

How can I increase the patience (and thus the number of epochs allowed) when the learning rate decreases?

I would like patience=50 when lr = 0.2 at the beginning, and a higher patience as the learning rate decreases (for example, patience=100 when lr=0.02 and patience=1000 when lr=0.002), to give the algorithm more time when the lr is small.

Upvotes: 1

Views: 207

Answers (1)

McGuile

Reputation: 828

I believe this is possible by implementing your own ReduceLROnPlateau class, essentially copying over Keras' code and modifying it.

This is the class you would copy over and modify.

Change the class signature to:

class ReduceLROnPlateau(keras.callbacks.Callback):

then look for the lines where the wait counter is compared against the patience, and where the LR is reduced. Modify those lines so that the patience becomes longer once the LR drops below a certain value.

Finally, use this class in your callbacks instead of Keras' ReduceLROnPlateau.
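The core change can be sketched without any Keras machinery. Below is a framework-free sketch of the logic: the lr-to-patience mapping mirrors the values in the question, and the class name, constructor arguments, and `on_epoch_end` method are illustrative stand-ins for the corresponding pieces of the copied callback code, not a real Keras API.

```python
class ScaledPatienceReducer:
    """Sketch of a ReduceLROnPlateau variant whose patience grows as lr shrinks.

    Reduces lr by `factor` after `patience` non-improving epochs, where the
    patience is looked up from the current lr. All names are illustrative; in
    a real Keras callback this logic would live in on_epoch_end.
    """

    def __init__(self, lr=0.2, factor=0.1,
                 patience_for_lr=((0.2, 50), (0.02, 100), (0.002, 1000))):
        self.lr = lr
        self.factor = factor
        # Sorted descending so the first threshold <= current lr wins.
        self.patience_for_lr = sorted(patience_for_lr, reverse=True)
        self.best = float("inf")   # best val_loss seen so far
        self.wait = 0              # epochs since last improvement

    @property
    def patience(self):
        # Pick the patience for the highest threshold the current lr reaches.
        for threshold, patience in self.patience_for_lr:
            if self.lr >= threshold:
                return patience
        # lr fell below every threshold: use the smallest-lr entry.
        return self.patience_for_lr[-1][1]

    def on_epoch_end(self, val_loss):
        if val_loss < self.best:
            self.best = val_loss
            self.wait = 0
        else:
            self.wait += 1
            # Patience is re-evaluated here, so it lengthens after each drop.
            if self.wait >= self.patience:
                self.lr *= self.factor
                self.wait = 0
        return self.lr
```

With a constant (plateaued) val_loss, the first reduction fires after 50 non-improving epochs; after that drop the same reducer waits 100 epochs before the next one, matching the schedule asked for in the question.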

Upvotes: 1
