filippo

Reputation: 5294

Keras, print metrics only for validation data

Sorry for the trivial question, please point me to a better source if this is not the proper place.

Is there a way to print Keras metrics only for validation_data?

I'd like to track a couple of epoch-specific metrics (e.g. precision, recall, and F1 score). There are handy implementations of them in the Keras git history, but they make no sense when computed batch-wise at training time, while they do at test/validation time.
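To see why batch-wise values are misleading, here is a small pure-Python sketch (the confusion counts are made up for illustration): averaging per-batch F1 scores generally differs from the F1 computed over the pooled epoch counts, because F1 is not decomposable over batches.

```python
def f1(tp, fp, fn):
    """F1 score from raw true-positive / false-positive / false-negative counts."""
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    if precision + recall == 0:
        return 0.0
    return 2 * precision * recall / (precision + recall)

# Two batches with hypothetical confusion counts (tp, fp, fn)
batches = [(9, 1, 0), (1, 0, 9)]

# Mean of per-batch F1 scores (what a batch-wise metric reports)
batchwise = sum(f1(*b) for b in batches) / len(batches)

# F1 over the pooled epoch-level counts (the number that actually matters)
tp = sum(b[0] for b in batches)
fp = sum(b[1] for b in batches)
fn = sum(b[2] for b in batches)
epochwise = f1(tp, fp, fn)

print(round(batchwise, 2))  # 0.56
print(round(epochwise, 2))  # 0.67
```

The two numbers disagree, so the per-batch values printed during training carry no real signal.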

So the training metrics are just cluttering my logs for nothing. Is there a way to mute them?

EDIT: I know I can subclass Callback and compute these metrics only on epoch end, but that way I'm running two predictions on the validation data per epoch: one in my callback and one done by Keras under the hood to compute the validation loss.

Upvotes: 4

Views: 4270

Answers (1)

Yu-Yang

Reputation: 14619

The ProgbarLogger callback is added to the model only if verbose > 0, as seen in the Keras source code (training.py):

if verbose:
    if steps_per_epoch is not None:
        count_mode = 'steps'
    else:
        count_mode = 'samples'
    callbacks += [cbks.ProgbarLogger(count_mode)]

So a possible workaround is:

  • Specify verbose=0 in fit() to suppress the built-in ProgbarLogger
  • Subclass ProgbarLogger and change the code to ignore the training metrics
  • Add this callback when calling fit()

For example,

from keras.callbacks import ProgbarLogger

class ValOnlyProgbarLogger(ProgbarLogger):
    def __init__(self, verbose, count_mode='samples'):
        # Ignore the `verbose` argument specified in `fit()` and pass `count_mode` to the parent
        self.verbose = verbose
        super(ValOnlyProgbarLogger, self).__init__(count_mode)

    def on_train_begin(self, logs=None):
        # filter out the training metrics
        self.params['metrics'] = [m for m in self.params['metrics'] if m.startswith('val_')]
        self.epochs = self.params['epochs']

from keras.layers import Input, Dense
from keras.models import Model

input_tensor = Input(shape=(256,))
out = Dense(10)(input_tensor)
model = Model(input_tensor, out)
model.compile(loss='mse', optimizer='adam', metrics=['mae', 'cosine'])
model.fit(X, Y, validation_data=(XX, YY), verbose=0,
          callbacks=[ValOnlyProgbarLogger(verbose=1)])

The training metrics will now be suppressed:

Epoch 1/1
1000/1000 [==============================] - 0s 392us/step - val_loss: 0.2479 - val_mean_absolute_error: 0.3988 - val_cosine_proximity: -0.7022

Note that if you're using fit_generator instead of fit, you'll need to pass count_mode='steps' when initializing ValOnlyProgbarLogger.

Upvotes: 3
