Reputation: 7267
I'm defining a custom loss function. For example, take loss = L1 loss + L2 loss. When I call model.fit_generator(), the overall loss value is printed after every batch, but I want to see the individual values of the L1 loss and the L2 loss. How can I do this? I want to know the values of the individual terms to understand their relative scales.
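For context, here is a rough sketch of the kind of loss function I mean (the L1/L2 terms below are just stand-ins for what I actually compute):

import sys
import tensorflow as tf

def custom_loss(y_true, y_pred):
    # Stand-in terms; the real L1/L2 losses are computed similarly.
    l1_loss = tf.reduce_mean(tf.math.abs(y_pred - y_true))
    l2_loss = tf.reduce_mean(tf.math.square(y_pred - y_true))
    return l1_loss + l2_loss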
tf.print(l1_loss, output_stream=sys.stdout) throws an exception: tensorflow.python.eager.core._FallbackException: This function does not handle the case of the path where all inputs are not already EagerTensors. Even tf.print('---') prints --- only once, at the start, rather than for every batch. And tf.keras.backend.print_tensor(l1_loss) prints nothing at all.
Upvotes: 4
Views: 2185
Reputation: 3876
Without seeing your code, I can only guess that you didn't decorate your custom loss function with the @tf.function decorator.
import numpy as np
import tensorflow as tf

@tf.function  # <-- Be sure to use this decorator.
def custom_loss(y_true, y_pred):
    loss = tf.reduce_mean(tf.math.abs(y_pred - y_true))
    tf.print(loss)  # <-- Use tf.print(), instead of print(). You can print not just 'loss', but any TF tensor in this function using this approach.
    return loss

model = tf.keras.Sequential()
model.add(tf.keras.layers.Dense(1, input_shape=[8]))
model.compile(loss=custom_loss, optimizer="sgd")

# Toy dataset: 100 all-ones samples, batched in pairs.
x_data = tf.data.Dataset.from_tensor_slices([np.ones(8)] * 100)
y_data = tf.data.Dataset.from_tensor_slices([np.ones(1)] * 100)
data = tf.data.Dataset.zip((x_data, y_data)).batch(2)

model.fit_generator(data, steps_per_epoch=10, epochs=2)
The output looks like the following; the values printed by tf.print interleave with the Keras progress bar and give you the batch-by-batch loss values.
Epoch 1/2
0.415590227 1/10 [==>...........................] - ETA: 0s - loss: 0.41560.325590253
0.235590339
0.145590425
0.0555904508
0.034409523
0.0555904508
0.034409523
0.0555904508
0.034409523 10/10 [==============================] - 0s 11ms/step - loss: 0.1392 Epoch 2/2
0.0555904508 1/10 [==>...........................] - ETA: 0s - loss: 0.05560.034409523
0.0555904508
0.034409523
0.0555904508
0.034409523
0.0555904508
0.034409523
0.0555904508
0.034409523 10/10 [==============================] - 0s 498us/step - loss: 0.0450
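Applying the same pattern to the combined loss from the question is straightforward. A sketch (the L1/L2 terms here are placeholders; substitute your actual computations):

@tf.function
def custom_loss(y_true, y_pred):
    # Placeholder terms; replace with your real L1/L2 losses.
    l1_loss = tf.reduce_mean(tf.math.abs(y_pred - y_true))
    l2_loss = tf.reduce_mean(tf.math.square(y_pred - y_true))
    tf.print("l1:", l1_loss, "l2:", l2_loss)  # prints both terms every batch
    return l1_loss + l2_loss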
Upvotes: 5