Reputation: 31
I am trying to save the logs dictionary that gets passed into a tf.keras callback at the end of every epoch, to track how the model did along the way. I tried writing a custom tf.keras callback that appends the logs value passed to the callback to an array that the same callback initializes at the beginning of training. However, upon debugging, the array I initialized on the model is None at the end of the second epoch. Below is the custom callback I created:
class LogEpochScores(tf.keras.callbacks.Callback):
    def __init__(self):
        super(LogEpochScores, self).__init__()

    def on_train_begin(self, logs=None):
        self.model.epoch_log = []

    def on_epoch_end(self, epoch, logs=None):
        self.model.epoch_log = self.model.epoch_log.append(logs)
I expected the following array at the end of training:
[
    {'loss': 1, 'acc': 1, 'val_loss': 1, 'val_acc': 1},
    {'loss': 2, 'acc': 2, 'val_loss': 2, 'val_acc': 2},
    {'loss': 3, 'acc': 1, 'val_loss': 3, 'val_acc': 3},
    {'loss': 4, 'acc': 1, 'val_loss': 4, 'val_acc': 4},
    {'loss': 5, 'acc': 1, 'val_loss': 5, 'val_acc': 5},
    {'loss': 6, 'acc': 1, 'val_loss': 6, 'val_acc': 6}
]
Each entry holds the results from the respective epoch. The actual output, however, is None.
Edit: Formatting
Upvotes: 1
Views: 1683
Reputation: 31
Apparently I was using append the wrong way: list.append mutates the list in place and returns None, so you shouldn't assign its return value back to the variable. The corrected code is as follows:
class LogEpochScores(tf.keras.callbacks.Callback):
    def __init__(self):
        super(LogEpochScores, self).__init__()

    def on_train_begin(self, logs=None):
        self.model.epoch_log = []

    def on_epoch_end(self, epoch, logs=None):
        self.model.epoch_log.append(logs)
Notice that in on_epoch_end I changed
self.model.epoch_log = self.model.epoch_log.append(logs)
to
self.model.epoch_log.append(logs)
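To see why the original assignment produced None, here is a minimal, Keras-free illustration of list.append's in-place behavior (the dictionary contents are made up for illustration):

```python
# list.append mutates the list in place and returns None,
# so assigning its return value throws the list away.
epoch_log = []
result = epoch_log.append({"loss": 1.0, "acc": 0.5})

print(result)     # None: append has no meaningful return value
print(epoch_log)  # [{'loss': 1.0, 'acc': 0.5}]

# Correct usage: call append purely for its side effect.
epoch_log.append({"loss": 0.8, "acc": 0.6})
print(epoch_log)  # now holds both epochs' logs
```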
Additionally, the tf.keras model object has an attribute called history, which is exactly what I needed. So in the end there was no need for this callback.
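For reference, a minimal sketch of reading the built-in history instead of using a custom callback; the toy model and random data here are made up purely for illustration. model.fit returns a History object whose history attribute is a dict mapping each metric name to a per-epoch list:

```python
import numpy as np
import tensorflow as tf

# Toy data and model, purely for illustration.
x = np.random.rand(16, 4).astype("float32")
y = np.random.rand(16, 1).astype("float32")

model = tf.keras.Sequential([tf.keras.layers.Dense(1)])
model.compile(optimizer="sgd", loss="mse")

# fit() returns a History object; history.history maps each
# metric name to a list with one entry per epoch.
history = model.fit(x, y, epochs=3, verbose=0)
print(history.history["loss"])  # one loss value per epoch
```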
Upvotes: 2