Eli Hektor

Reputation: 79

What is the difference between tf.train.AdamOptimizer and using 'adam' in keras compile?

I was building a dense neural network for predicting poker hands. At first I had a problem with reproducibility, but then I discovered the real cause: my code is not reproducible because of the Adam optimizer, since with SGD it worked. This means

model.compile(loss='sparse_categorical_crossentropy', optimizer='adam', metrics=['accuracy'])

did NOT work, whereas

opti = tf.train.AdamOptimizer()
model.compile(loss='sparse_categorical_crossentropy', optimizer=opti, metrics=['accuracy'])

was reproducible. So my question is: is there any difference between using

tf.train.AdamOptimizer

and

model.compile(..., optimizer = 'adam')

because I would like to use the first one due to the reproducibility problem.

Upvotes: 2

Views: 1437

Answers (1)

Hadok 361

Reputation: 64

Both implement the same Adam algorithm. However, with tf.train.AdamOptimizer you can set the learning rate and the other hyperparameters explicitly:

tf.compat.v1.train.AdamOptimizer(
    learning_rate=0.001, beta1=0.9, beta2=0.999, epsilon=1e-08, use_locking=False,
    name='Adam')

Tuning the learning rate can improve learning performance, although a smaller learning rate will make training take longer. With model.compile(optimizer='adam'), the learning rate, beta1, beta2, etc. are all left at their default settings.
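For completeness, a sketch (assuming TensorFlow 2.x with tf.keras; the layer sizes here are hypothetical) of how to get the same configurability without the TF1 tf.train API, by passing an optimizer instance to compile instead of the string:

```python
import tensorflow as tf

# A minimal model for illustration (hypothetical layer sizes).
model = tf.keras.Sequential([
    tf.keras.Input(shape=(10,)),
    tf.keras.layers.Dense(32, activation='relu'),
    tf.keras.layers.Dense(10, activation='softmax'),
])

# Passing the string 'adam' uses tf.keras.optimizers.Adam with its defaults
# (learning_rate=0.001, beta_1=0.9, beta_2=0.999; note tf.keras uses
# epsilon=1e-07 by default, while tf.train.AdamOptimizer used 1e-08).
model.compile(loss='sparse_categorical_crossentropy',
              optimizer='adam',
              metrics=['accuracy'])

# Passing an optimizer instance lets you set the hyperparameters explicitly,
# exactly as with tf.train.AdamOptimizer, but within the Keras API.
opti = tf.keras.optimizers.Adam(learning_rate=0.001, beta_1=0.9,
                                beta_2=0.999, epsilon=1e-07)
model.compile(loss='sparse_categorical_crossentropy',
              optimizer=opti,
              metrics=['accuracy'])
```

So whether you configure the optimizer through tf.train.AdamOptimizer or through a tf.keras.optimizers.Adam instance, compile accepts it the same way; only the string form locks you into the defaults.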

Upvotes: 2

Related Questions