iOSnewbie

Reputation: 53

How to debug custom metric values in tf.keras

I have defined a very simple custom metric in tf.keras for tracking the number of pixels predicted as '1' in a segmentation problem. Since the output from the last layer has sigmoid activation, I'm rounding y_pred and then summing. I expect to see a whole integer value (>= 0) because of the rounding, but the output shows floating point numbers like 0.28. How is that possible? How can I debug this to figure out where the problem is?

I tried switching from tf.keras.backend.sum and tf.keras.backend.round to tf.reduce_sum and tf.round, but that didn't solve the issue.

def num_ones(y_true, y_pred):
    return tf.keras.backend.sum(tf.keras.backend.flatten(tf.keras.backend.round(y_pred)))

model.compile(optimizer = tf.train.AdamOptimizer(learning_rate = 1e-4), loss = 'binary_crossentropy', metrics = ['accuracy', num_ones])


output-
INFO:tensorflow:Saving dict for global step 3408: accuracy = 0.9551756, global_step = 3408, loss = 0.7224839, num_ones = 0.28

Upvotes: 4

Views: 1208

Answers (2)

Grzegorz Wilczyński

Reputation: 71

The function

tf.config.run_functions_eagerly(True)

works fine with TensorFlow > 2.3, but you have to define your custom metric as a TensorFlow function (add the decorator):

@tf.function
def num_ones(y_true, y_pred):
    return tf.keras.backend.sum(tf.keras.backend.flatten(tf.keras.backend.round(y_pred)))
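As a minimal sketch (using a made-up 2x2 batch of sigmoid outputs, not data from the question), you can then call the metric directly with eager execution forced and confirm it returns a whole number per batch:

```python
import tensorflow as tf

tf.config.run_functions_eagerly(True)  # force @tf.function bodies to run eagerly

@tf.function
def num_ones(y_true, y_pred):
    # Round sigmoid outputs to {0, 1} and count the ones
    return tf.keras.backend.sum(tf.keras.backend.flatten(tf.keras.backend.round(y_pred)))

# Made-up sigmoid outputs for a single 2x2 prediction
y_pred = tf.constant([[0.9, 0.2], [0.7, 0.4]])
y_true = tf.constant([[1.0, 0.0], [1.0, 0.0]])

# With eager execution on, the result is an ordinary inspectable tensor
print(num_ones(y_true, y_pred).numpy())  # 2.0 -- a whole number, as expected
```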

Upvotes: 3

AndersonHappens

Reputation: 547

To answer how you should debug the custom metric, call the following function at the top of your Python script:

tf.config.experimental_run_functions_eagerly(True)

This will force TensorFlow to run all functions eagerly (including custom metrics), so you can then just set a breakpoint and check the values of everything like you would normally in your debugger.
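For instance (a hypothetical sketch with toy tensors; on TF >= 2.3 the non-experimental tf.config.run_functions_eagerly is the replacement), once the switch is on you can drop a print or breakpoint inside the metric itself and inspect intermediate values:

```python
import tensorflow as tf

# Deprecated alias; on TF >= 2.3 use tf.config.run_functions_eagerly(True)
tf.config.experimental_run_functions_eagerly(True)

def num_ones(y_true, y_pred):
    rounded = tf.round(tf.reshape(y_pred, [-1]))
    # Runs eagerly now, so .numpy() works and a debugger breakpoint stops here
    print("rounded:", rounded.numpy())
    return tf.reduce_sum(rounded)

num_ones(tf.constant([1.0, 0.0]), tf.constant([0.6, 0.3]))  # prints rounded: [1. 0.]
```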

Upvotes: 2
