Reputation: 31
I want to gradually increase a coefficient in a Keras model that is used in calculating the loss. The value is based on the current epoch. However, when I try to set the value, I get the following error:
AttributeError: 'float' object has no attribute 'dtype'
My code:
def warm_up(epoch, logs):
    new_value = tf.keras.backend.variable(np.array(1.0, dtype=np.float32), dtype=tf.float32)
    tf.keras.backend.set_value(model.variable1, new_value)

callback = tf.keras.callbacks.LambdaCallback(on_epoch_begin=warm_up)
model.fit(..., callbacks=[callback])
How can I change a variable in a custom Keras model during training? I am using TensorFlow 2.2.
Traceback:
~\Anaconda3\lib\site-packages\tensorflow\python\keras\engine\training.py in _method_wrapper(self, *args, **kwargs)
     64   def _method_wrapper(self, *args, **kwargs):
     65     if not self._in_multi_worker_mode():  # pylint: disable=protected-access
---> 66       return method(self, *args, **kwargs)
     67
     68   # Running inside `run_distribute_coordinator` already.

~\Anaconda3\lib\site-packages\tensorflow\python\keras\engine\training.py in fit(self, x, y, batch_size, epochs, verbose, callbacks, validation_split, validation_data, shuffle, class_weight, sample_weight, initial_epoch, steps_per_epoch, validation_steps, validation_batch_size, validation_freq, max_queue_size, workers, use_multiprocessing)
    836         for epoch, iterator in data_handler.enumerate_epochs():
    837           self.reset_metrics()
--> 838           callbacks.on_epoch_begin(epoch)
    839           with data_handler.catch_stop_iteration():
    840             for step in data_handler.steps():

~\Anaconda3\lib\site-packages\tensorflow\python\keras\callbacks.py in on_epoch_begin(self, epoch, logs)
    347     logs = self._process_logs(logs)
    348     for callback in self.callbacks:
--> 349       callback.on_epoch_begin(epoch, logs)
    350     self._reset_batch_timing()
    351

c:\Users\..\training.py in warm_up(epoch, logs)
    379 def warm_up(epoch, logs):
    380     test = tf.keras.backend.variable(np.array(1.0, dtype=np.float32), dtype=tf.float32)
--> 381     tf.keras.backend.set_value(model.variable1, test)
    382
    383

~\Anaconda3\lib\site-packages\tensorflow\python\keras\backend.py in set_value(x, value)
   3349       (of the same shape).
   3350   """
-> 3351   value = np.asarray(value, dtype=dtype(x))
   3352   if ops.executing_eagerly_outside_functions():
   3353     x.assign(value)

~\Anaconda3\lib\site-packages\tensorflow\python\keras\backend.py in dtype(x)
   1266
   1267   """
-> 1268   return x.dtype.base_dtype.name
   1269
   1270

AttributeError: 'float' object has no attribute 'dtype'
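Reading the traceback: set_value passes model.variable1 to backend.dtype, which expects a tensor or variable, so model.variable1 is apparently still a plain Python float. A minimal sketch of the difference (MyModel is hypothetical, since my custom model isn't shown here):
import tensorflow as tf

class MyModel(tf.keras.Model):
    def __init__(self):
        super(MyModel, self).__init__()
        # A plain float has no .dtype, so backend.set_value fails on it:
        # self.variable1 = 1.0
        # A tf.Variable is what backend.set_value expects:
        self.variable1 = tf.Variable(1.0, trainable=False, dtype=tf.float32)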
EDIT: I changed my code to the following:
class LossCallback(tf.keras.callbacks.Callback):
    def __init__(self):
        super(LossCallback, self).__init__()
        self.model.beta_x = tf.Variable(1.0, trainable=False, name='weight1', dtype=tf.float32)

    def on_epoch_begin(self, epoch, logs=None):
        tf.keras.backend.set_value(self.model.beta_x, tf.constant(0.5) * epoch)

    def on_epoch_end(self, epoch, logs=None):
        logs = logs or {}
        logs['beta_x'] = tf.keras.backend.get_value(self.model.beta_x)
I still get an error in on_epoch_begin: 'NoneType' object has no attribute 'beta_x'.
Upvotes: 2
Views: 3322
Reputation: 938
Avoid editing the variables directly. You must access Keras variables like this:
import tensorflow as tf
from tensorflow import keras

def warm_up(epoch, logs):
    val = keras.backend.get_value(model.optimizer.lr)
    val *= 1.1
    tf.keras.backend.set_value(model.optimizer.lr, val)

callback = tf.keras.callbacks.LambdaCallback(on_epoch_begin=warm_up)

model = tf.keras.models.Sequential([
    keras.layers.Dense(10, 'relu'),
    keras.layers.Dense(1, 'sigmoid')
])
model.compile(loss='binary_crossentropy')

X_train = tf.random.uniform((10, 10))
y_train = tf.ones((10,))

model.fit(X_train, y_train, callbacks=[callback])
Notice how I get the current value with val = keras.backend.get_value(model.optimizer.lr). This is the correct way to read a variable's value at runtime.
Also, do not declare new variables inside a loop (or inside a callback that runs every epoch). You can usually compute the new_value by reading and modifying the old one.
Also, avoid using any library other than TensorFlow inside callbacks, especially if your callbacks are going to get called often. Don't use numpy; use TensorFlow. There is virtually always a TensorFlow operation that does what you need.
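For instance, a warm-up computed with TensorFlow ops only might look like this (a sketch; the linear 10-epoch ramp and targeting the optimizer's learning rate are illustrative choices, not requirements):
def warm_up(epoch, logs):
    # Linear ramp from 0 to 1 over the first 10 epochs, clamped at 1.0.
    # tf.cast and tf.minimum are TensorFlow ops; no numpy is involved.
    new_val = tf.minimum(tf.cast(epoch, tf.float32) / 10.0, 1.0)
    tf.keras.backend.set_value(model.optimizer.lr, new_val)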
Edit: If you have some custom value to update, you can use a pattern like this:
class LossCallback(tf.keras.callbacks.Callback):
    def __init__(self):
        super(LossCallback, self).__init__()
        self.someValue = tf.Variable(1.0, trainable=False, name='weight1', dtype=tf.float32)

    def on_epoch_end(self, epoch, logs=None):
        tf.keras.backend.set_value(self.model.loss.someValue, self.someValue * epoch)
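For that pattern to work, the loss object passed to compile needs a someValue attribute of its own. A minimal sketch of such a loss, assuming a subclassed tf.keras.losses.Loss (the WeightedLoss name and the MSE term are illustrative):
class WeightedLoss(tf.keras.losses.Loss):
    def __init__(self):
        super(WeightedLoss, self).__init__()
        # Non-trainable variable the callback updates between epochs.
        self.someValue = tf.Variable(1.0, trainable=False, dtype=tf.float32)

    def call(self, y_true, y_pred):
        # Scale a plain MSE by the scheduled coefficient.
        return self.someValue * tf.reduce_mean(tf.square(y_true - y_pred))

model.compile(optimizer='adam', loss=WeightedLoss())
model.fit(X_train, y_train, callbacks=[LossCallback()])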
Or you could still try to use a lambda callback. From within the callback you can access any variable of the model, like self.model.someVariable. You can also access any custom variables defined in your model's custom __init__ function, like this:
# in the model's custom __init__
def __init__(self, someArgs):
    ...
    self.someArg = someArgs
    ...

# in the callback's "on_epoch_..." method
...
keras.backend.set_value(self.model.someArg, 42)
...
Note that you can't use self.model in the callback's __init__ function, as the model attribute is still uninitialized when the callback's __init__ is called.
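One way around that (a sketch, assuming the variable only needs to exist before the first epoch) is to create it in on_train_begin, which runs after Keras has attached the model to the callback:
class WarmUpCallback(tf.keras.callbacks.Callback):
    def on_train_begin(self, logs=None):
        # Unlike in __init__, self.model is already set here.
        self.model.beta_x = tf.Variable(0.0, trainable=False, dtype=tf.float32)

    def on_epoch_begin(self, epoch, logs=None):
        # Raise the coefficient by 0.5 per epoch.
        tf.keras.backend.set_value(self.model.beta_x, 0.5 * epoch)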
Does this help?
Upvotes: 0
Reputation: 31
EDIT 2: When I initialize my model first and pass it as an extra parameter to the callback's constructor, it works. So the solution is as follows:
class LossCallback(tf.keras.callbacks.Callback):
    def __init__(self, model):
        super(LossCallback, self).__init__()
        model.beta_x = tf.Variable(1.0, trainable=False, name='weight1', dtype=tf.float32)

    def on_epoch_begin(self, epoch, logs=None):
        tf.keras.backend.set_value(self.model.beta_x, tf.constant(0.5) * epoch)

    def on_epoch_end(self, epoch, logs=None):
        logs = logs or {}
        logs['beta_x'] = tf.keras.backend.get_value(self.model.beta_x)

model = create_model()  # initialize custom keras model
callback = LossCallback(model)
model.fit(..., callbacks=[callback])
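For completeness, here is one way the scheduled coefficient might be consumed by the loss (a sketch; the make_loss closure and the MSE term are illustrative, since the custom model's loss isn't shown):
def make_loss(model):
    # The closure reads model.beta_x at call time, so updates made by
    # LossCallback are picked up on the next training step.
    def loss_fn(y_true, y_pred):
        return model.beta_x * tf.reduce_mean(tf.square(y_true - y_pred))
    return loss_fn

model = create_model()
callback = LossCallback(model)  # attaches beta_x to the model
model.compile(optimizer='adam', loss=make_loss(model))
model.fit(..., callbacks=[callback])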
Thanks to @tornikeo for the great help!
Upvotes: 1