Andrew

Reputation: 85

Trying to get the val_loss from training a model

I have the following class, in which I try to build a list of tuples containing the loss and validation loss collected during training

import keras

class LossHistory(keras.callbacks.Callback):
    def on_train_begin(self, logs={}):
        self.losses = []

    def on_batch_end(self, batch, logs={}):
        self.losses.append((logs.get('loss'), logs.get('val_loss')))

I initialize the LossHistory object beforehand

history = LossHistory()

And then I pass it to the fit method on my model in the following way

regressor.fit(X_train, y_train, batch_size=32, epochs=200, validation_split=0.2, callbacks=[history])

The problem is that when I try to get history.losses[0][1], the value returned is None, and it shouldn't be.

I don't know what I'm doing wrong here; I've been stuck on this for some time.

Upvotes: 1

Views: 1575

Answers (1)

Haydi80

Reputation: 26

My answer is late, but maybe it can help someone else.

on_batch_end is called at the end of each batch, and at that moment val_loss is not available yet: val_loss is calculated at the end of each epoch, not at the end of each batch. If you also want val_loss at the end of each batch, you have to compute it yourself, but be aware that this will slow down training. You could do something like the following.

You should add an __init__ method to your class that receives the validation data:

def __init__(self, validation_data):
    super().__init__()
    self.validation_data = validation_data
    self.val_losses = []

def on_batch_end(self, batch, logs={}):
    x, y = self.validation_data
    # evaluate() returns a single scalar when the model is compiled with a
    # loss only; with extra metrics it returns [loss, *metrics], so adjust
    # the unpacking accordingly
    val_loss = self.model.evaluate(x, y, verbose=0)
    self.val_losses.append(val_loss)
    self.losses.append(logs.get('loss'))
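
If per-epoch values are enough, you don't even need to run evaluate yourself: val_loss is already present in the logs passed to on_epoch_end once fit is given validation data. A minimal sketch, reusing the regressor, X_train and y_train from the question (the class name EpochLossHistory is just illustrative):

import keras

class EpochLossHistory(keras.callbacks.Callback):
    def on_train_begin(self, logs=None):
        self.losses = []

    def on_epoch_end(self, epoch, logs=None):
        logs = logs or {}
        # 'val_loss' is filled in here because fit() was given validation data
        self.losses.append((logs.get('loss'), logs.get('val_loss')))

epoch_history = EpochLossHistory()
regressor.fit(X_train, y_train, batch_size=32, epochs=200,
              validation_split=0.2, callbacks=[epoch_history])
print(epoch_history.losses[0])  # (loss, val_loss) of the first epoch

Note that fit itself already returns a History object, so regressor.fit(...).history['val_loss'] gives the same per-epoch values without any custom callback at all.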

Upvotes: 1
