garbage_collector

Reputation: 103

Exponential decay learning rate parameters of Adam optimizer in Keras

My problem is to choose the decay step in such a way that the decay occurs every two epochs. How can I achieve this in Keras?

This is the formula of the exponential decay learning rate:

decayed_learning_rate = initial_learning_rate * decay_rate ^ (step / decay_steps)
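
To make the formula concrete, here is a plain-Python sketch of it (all the numbers are illustrative):

initial_learning_rate = 0.001
decay_rate = 0.7
decay_steps = 200  # illustrative; this is the value I need to choose

def decayed_lr(step):
    # Continuous exponential decay: the learning rate shrinks
    # smoothly as the global training step increases.
    return initial_learning_rate * decay_rate ** (step / decay_steps)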

Upvotes: 3

Views: 3085

Answers (1)

benbotto

Reputation: 2440

It looks like the tf.keras.optimizers.schedules.ExponentialDecay learning rate schedule could be used. To decay every two epochs, decay_steps should be steps_per_epoch * 2. Also set the staircase parameter to True so that the learning rate decays discretely (in steps) rather than continuously.

Something like this (I didn't run this code):

import tensorflow as tf

initial_learning_rate = 0.0002
steps_per_epoch = ...  # number of training batches per epoch

# Multiply the learning rate by decay_rate once every two epochs;
# staircase=True makes the decay happen in discrete jumps.
lr_schedule = tf.keras.optimizers.schedules.ExponentialDecay(
    initial_learning_rate,
    decay_steps=steps_per_epoch * 2,
    decay_rate=0.7,
    staircase=True)

Then pass lr_schedule to Adam using the learning_rate parameter.
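
For example, continuing from the snippet above (again unverified; the loss is just illustrative):

# `model` stands in for whatever Keras model you have already built.
optimizer = tf.keras.optimizers.Adam(learning_rate=lr_schedule)
model.compile(optimizer=optimizer, loss="sparse_categorical_crossentropy")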

Upvotes: 2
