Mauro Gentile

Reputation: 1511

Keras equivalent of a TensorFlow optimizer

I need to translate an NN from TF into Keras. It is mostly straightforward, except for the "momentum" optimizer in TensorFlow.

My best guess is that "momentum" in TF corresponds to SGD in Keras. Is this correct?

If so, what values of the hyperparameters "lr, momentum, decay, nesterov" should I set in Keras to match the default call optimizer='momentum' in TF?

Thank you!

Line to translate:

network = regression(
    network,
    optimizer='momentum',
    loss='categorical_crossentropy'
)

Upvotes: 1

Views: 719

Answers (2)

Lescurel

Reputation: 11631

It seems that the TensorFlow code you're trying to translate is using the high-level API TFLearn.

By default, the Momentum object from TFLearn is initialized with these values:

def __init__(self, learning_rate=0.001, momentum=0.9, lr_decay=0.,
             decay_step=100, staircase=False, use_locking=False,
             name="Momentum"):

It also doesn't use Nesterov momentum. See the GitHub repo for more information.

To translate it to Keras, I would use:

import keras

# define all your layers in the network variable, then compile with SGD
momentum = keras.optimizers.SGD(lr=0.001, momentum=0.9, decay=0., nesterov=False)
network.compile(loss='categorical_crossentropy', optimizer=momentum)
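For context, here is a minimal end-to-end sketch of what the translated compile step could look like in plain Keras. The Sequential model and layer sizes are placeholders, not taken from the original network:

from keras.models import Sequential
from keras.layers import Dense
from keras import optimizers

# Placeholder architecture; substitute the layers from your actual network
network = Sequential([
    Dense(64, activation='relu', input_shape=(100,)),
    Dense(10, activation='softmax'),
])

# Matches TFLearn's Momentum defaults: learning_rate=0.001, momentum=0.9, no Nesterov
momentum = optimizers.SGD(lr=0.001, momentum=0.9, decay=0., nesterov=False)
network.compile(loss='categorical_crossentropy', optimizer=momentum)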

Upvotes: 2

sdcbr

Reputation: 7129

Check out the Keras docs on optimizers.

You have two options: either you pass the optimizer by name when compiling the model, in which case the default options will apply:

model.compile(loss=..., optimizer='sgd')

Or you instantiate an Optimizer object separately and specify additional options before passing it to model.compile():

from keras import optimizers
optimizer = optimizers.SGD(lr=..., momentum=...) # specify options such as momentum here
model.compile(loss=..., optimizer=optimizer)
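Note that the by-name option uses Keras's SGD defaults (lr=0.01, momentum=0.0), which differ from TFLearn's Momentum defaults, so to match the original behaviour you would take the second route with explicit values. A minimal sketch for this particular question, with the values taken from the signature shown in the other answer:

from keras import optimizers

# Explicit values matching TFLearn's Momentum defaults
optimizer = optimizers.SGD(lr=0.001, momentum=0.9, decay=0., nesterov=False)
model.compile(loss='categorical_crossentropy', optimizer=optimizer)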

Upvotes: 0
