Reputation: 2234
I cannot seem to get the value of the learning rate; what I get is shown below.
I've trained the model for 200 epochs and want to see/change the learning rate. Is this not the correct way?
>>> print(ig_cnn_model.optimizer.lr)
<tf.Variable 'lr_6:0' shape=() dtype=float32_ref>
Upvotes: 41
Views: 56178
Reputation: 316
If you're using a learning rate schedule in tf2 and want to access the learning rate while the model is training, you can define a custom callback. Here is an example of a callback that prints the learning rate at the beginning of every epoch:
import tensorflow as tf
from tensorflow.keras import backend as K
from tensorflow.keras.callbacks import Callback

class PrintLearningRate(Callback):
    def __init__(self):
        super().__init__()

    def on_epoch_begin(self, epoch, logs=None):
        # _decayed_lr() returns the learning rate after the schedule has been applied
        lr = K.eval(self.model.optimizer._decayed_lr(tf.float64))
        print("\nLearning rate at epoch {} is {}".format(epoch, lr))
Notice that when a learning rate schedule is used, in tf2 the current learning rate can be accessed via _decayed_lr().
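A possible way to use it (a sketch; model, x_train and y_train are placeholders, and the ExponentialDecay schedule is just one example of a learning rate schedule):
import tensorflow as tf

# Any tf.keras learning rate schedule works the same way here
schedule = tf.keras.optimizers.schedules.ExponentialDecay(
    initial_learning_rate=0.01, decay_steps=1000, decay_rate=0.9)
model.compile(optimizer=tf.keras.optimizers.Adam(learning_rate=schedule),
              loss='mse')
model.fit(x_train, y_train, epochs=5, callbacks=[PrintLearningRate()])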
Upvotes: 2
Reputation: 66261
With TensorFlow >= 2.0:
In [1]: import tensorflow as tf
In [2]: opt = tf.keras.optimizers.Adam()
In [3]: opt.lr.numpy()
Out[3]: 0.001
lr is just a tf.Variable, so its value can be changed via the assign() method:
In [4]: opt.lr.assign(0.1)
Out[4]: <tf.Variable 'UnreadVariable' shape=() dtype=float32, numpy=0.1>
In [5]: opt.lr.numpy()
Out[5]: 0.1
The same goes for the rest of the hyperparameters:
In [6]: opt.decay.numpy()
Out[6]: 0.0
In [7]: opt.beta_1.numpy()
Out[7]: 0.9
In [8]: opt.beta_2.numpy()
Out[8]: 0.999
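For example, assign() can be used to lower the learning rate between two training phases; a minimal sketch, assuming model, x_train and y_train already exist:
import tensorflow as tf

opt = tf.keras.optimizers.Adam()          # default lr = 0.001
model.compile(optimizer=opt, loss='mse')
model.fit(x_train, y_train, epochs=10)    # first phase at lr = 0.001
opt.lr.assign(1e-4)                       # lower the learning rate in place
model.fit(x_train, y_train, epochs=10)    # continue training at lr = 0.0001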
Upvotes: 3
Reputation: 1200
You can change your learning rate by passing an optimizer with the desired value to compile():
from keras.optimizers import Adam

model.compile(optimizer=Adam(lr=0.001),
              loss='categorical_crossentropy',
              metrics=['accuracy'])
Upvotes: 19
Reputation: 13498
Use eval() from keras.backend:
import keras.backend as K
from keras.models import Sequential
from keras.layers import Dense
model = Sequential()
model.add(Dense(1, input_shape=(1,)))
model.add(Dense(1))
model.compile(loss='mse', optimizer='adam')
print(K.eval(model.optimizer.lr))
Output:
0.001
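Since the question also asks about changing the learning rate, the counterpart to eval() is set_value(); a short sketch reusing the model compiled above:
# Change the learning rate in place, then read it back
K.set_value(model.optimizer.lr, 0.0005)
print(K.eval(model.optimizer.lr))  # prints 0.0005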
Upvotes: 58
Reputation: 14072
Some of the optimizers don't include their names in their configs.
Here is a complete example of how to get the configs and how to reconstruct (i.e. clone) an optimizer from its configs (which include the learning rate as well).
import keras.optimizers as opt


def get_opt_config(optimizer):
    """
    Extract optimizer configs from an instance of a
    keras Optimizer.
    :param optimizer: instance of keras Optimizer.
    :return: dict of optimizer configs.
    """
    if not isinstance(optimizer, opt.Optimizer):
        raise TypeError('optimizer should be instance of '
                        'keras.optimizers.Optimizer '
                        'Got {}.'.format(type(optimizer)))
    opt_config = optimizer.get_config()
    if 'name' not in opt_config.keys():
        # Fall back to the class name if the config has no 'name' entry
        _name = str(optimizer.__class__).split('.')[-1] \
            .replace('\'', '').replace('>', '')
        opt_config.update({'name': _name})
    return opt_config


def clone_opt(opt_config):
    """
    Clone a keras optimizer from its configurations.
    :param opt_config: dict, keras optimizer configs.
    :return: instance of keras optimizer.
    """
    if not isinstance(opt_config, dict):
        raise TypeError('opt_config must be a dict. '
                        'Got {}'.format(type(opt_config)))
    if 'name' not in opt_config.keys():
        raise ValueError('could not find the name of optimizer in opt_config')
    name = opt_config.get('name')
    params = {k: opt_config[k] for k in opt_config.keys() if k != 'name'}
    if name.upper() == 'ADAM':
        return opt.Adam(**params)
    if name.upper() == 'NADAM':
        return opt.Nadam(**params)
    if name.upper() == 'ADAMAX':
        return opt.Adamax(**params)
    if name.upper() == 'ADADELTA':
        return opt.Adadelta(**params)
    if name.upper() == 'ADAGRAD':
        return opt.Adagrad(**params)
    if name.upper() == 'RMSPROP':
        return opt.RMSprop(**params)
    if name.upper() == 'SGD':
        return opt.SGD(**params)
    raise ValueError('Unknown optimizer name. Available are: '
                     '(\'adam\', \'sgd\', \'rmsprop\', \'adagrad\', '
                     '\'adadelta\', \'adamax\', \'nadam\'). '
                     'Got {}.'.format(name))


if __name__ == '__main__':
    rmsprop = opt.RMSprop()
    configs = get_opt_config(rmsprop)
    print(configs)
    cloned_rmsprop = clone_opt(configs)
    print(cloned_rmsprop)
    print(cloned_rmsprop.get_config())
Output:
{'lr': 0.0010000000474974513, 'rho': 0.8999999761581421, 'decay': 0.0, 'epsilon': 1e-07, 'name': 'RMSprop'}
<keras.optimizers.RMSprop object at 0x7f96370a9358>
{'lr': 0.0010000000474974513, 'rho': 0.8999999761581421, 'decay': 0.0, 'epsilon': 1e-07}
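For the original question, the learning rate can then be read straight from the config dict (continuing the example above; note the key is 'lr' in this Keras version):
lr = configs['lr']   # 0.0010000000474974513 for the default RMSprop
print(lr)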
Upvotes: 1
Reputation: 735
The best way to get all the information related to the optimizer is with .get_config().
Example:
model.compile(optimizer=optimizerF,
              loss=lossF,
              metrics=['accuracy'])
model.optimizer.get_config()
>>> {'name': 'Adam', 'learning_rate': 0.001, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False}
It returns a dict with all information.
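To pull out just the learning rate from that dict (the key is 'learning_rate' here; older Keras versions use 'lr' instead):
lr = model.optimizer.get_config()['learning_rate']   # 0.001 in the example above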
Upvotes: 26
Reputation: 2406
An alternate way:
from tensorflow import keras

opt = keras.optimizers.SGD()
print('learning rate={}'.format(opt.lr.numpy()))
model.compile(optimizer=opt, ...)
Upvotes: 1