Pulkit Pareek

Reputation: 1

ValueError: Could not interpret optimizer identifier: <keras.src.optimizers.adam.Adam object at 0x336f637d0>

I am trying to use PolynomialDecay to handle the learning rate in the Adam optimizer while fine-tuning a transformer model, but I am getting this error:

"ValueError: Could not interpret optimizer identifier: <keras.src.optimizers.adam.Adam object at 0x336f637d0>"

I am using TensorFlow 2.16.1 and Keras 3.8.
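For reference, this is how I checked the versions in my environment (the printed values are what I see locally):

import tensorflow as tf
import keras

print(tf.__version__)     # 2.16.1
print(keras.__version__)  # 3.8.x -- with TF 2.16, tf.keras should resolve to Keras 3 by default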

from tensorflow.keras.optimizers.schedules import PolynomialDecay
from tensorflow.keras.losses import SparseCategoricalCrossentropy
from tensorflow.keras.optimizers import Adam
num_epochs = 3
num_train_steps = len(tf_train_dataset) * num_epochs

lr_scheduler = PolynomialDecay(initial_learning_rate=5e-5,
                               end_learning_rate=0.0,
                               decay_steps=num_train_steps,
                               power=2)

opt = Adam(learning_rate=lr_scheduler)
loss = SparseCategoricalCrossentropy(from_logits=True)
model.compile(optimizer=opt, loss=loss, metrics=["accuracy"])

I get the error when I compile the model.

According to the Keras and TensorFlow documentation, the above code should work. Can anyone tell me what I am missing here?
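In case it helps, here is a small diagnostic I can run right before compiling. `model` is the transformer model I am fine-tuning (its loading code is not shown above), so the exact output depends on how it was created:

import keras

print(type(model).__module__)          # shows which package/Keras version the model class comes from
print(isinstance(model, keras.Model))  # False would suggest the model is not a Keras 3 model
print(type(opt).__module__)            # the optimizer is keras.src.optimizers.adam, i.e. Keras 3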

Upvotes: 0

Views: 48

Answers (0)
