Reputation: 1
This is my first question here. I'm playing with tensorflow.keras, building some CNNs, and the code below raises ValueError: The Nadam optimizer does not support tf.keras.optimizers.LearningRateSchedules as the learning rate. I'd like to know why this conflict arises. Thanks.
from tensorflow.keras.optimizers import Nadam
from tensorflow.keras.optimizers.schedules import ExponentialDecay
initial_learning_rate = 0.1
lr_schedule = ExponentialDecay(
    initial_learning_rate,
    decay_steps=100000,
    decay_rate=0.96,
    staircase=True)
model.compile(optimizer=Nadam(learning_rate=lr_schedule),
              loss='categorical_crossentropy',
              metrics=['accuracy'])
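For context, the schedule in the snippet computes initial_learning_rate * decay_rate ** (step / decay_steps), with the exponent floored when staircase=True. A plain-Python sketch of that formula (no TensorFlow needed; the function name is my own, not part of the API):

```python
def exponential_decay(step, initial_lr=0.1, decay_steps=100000,
                      decay_rate=0.96, staircase=True):
    """Plain-Python mirror of the ExponentialDecay schedule's formula."""
    exponent = step / decay_steps
    if staircase:
        # Integer division makes the rate drop in discrete steps
        # instead of decaying continuously.
        exponent = step // decay_steps
    return initial_lr * decay_rate ** exponent

# With staircase=True the rate stays at 0.1 for the first 100000 steps,
# then drops to 0.1 * 0.96 = 0.096, and so on.
```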
Upvotes: 0
Views: 589
Reputation:
This ValueError: The Nadam optimizer does not support tf.keras.optimizers.LearningRateSchedules as the learning rate
is raised because, in your TensorFlow version, the Nadam optimizer does not accept a LearningRateSchedule the way most other optimizers do. You can either pass Nadam a plain float learning rate, or switch to an optimizer that does support schedules, such as Adam, SGD, or RMSprop. Upgrading TensorFlow may also resolve it, since newer releases have added schedule support to Nadam.
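If you want to keep Nadam, one common workaround (a sketch of my own, not part of the original answer) is to drive the decay yourself with the tf.keras.callbacks.LearningRateScheduler callback, which only needs a plain Python function of the epoch. Note the question's decay_steps counts optimizer steps, while this callback is called per epoch, so decay_epochs below is a hypothetical stand-in you would tune:

```python
def staircase_decay(epoch, initial_lr=0.1, decay_rate=0.96, decay_epochs=10):
    """Return the learning rate for a given epoch (staircase schedule)."""
    # Mirrors the staircase ExponentialDecay formula, but per epoch.
    return initial_lr * decay_rate ** (epoch // decay_epochs)

# Attach it to training while keeping Nadam with a plain float rate, e.g.:
# model.compile(optimizer=Nadam(learning_rate=0.1),
#               loss='categorical_crossentropy', metrics=['accuracy'])
# model.fit(x, y, epochs=50,
#           callbacks=[tf.keras.callbacks.LearningRateScheduler(staircase_decay)])
```

This sidesteps the check entirely, because the optimizer only ever sees a float that the callback updates each epoch.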
Upvotes: 1