mathnoob123

Reputation: 229

Timing early stopping in terms of epochs

I have modeled a neural network in Keras which attains a train set accuracy of 1 at epoch 1000 (on the used hyperparameters), yet the validation accuracy keeps fluctuating between 0.78 and 0.8.

I would like an Early Stopping callback which monitors the validation accuracy but only starts after the 1000th epoch, because before the 1000th epoch the validation accuracy fluctuates heavily. So my strategy is to maximize training set accuracy and then stop the learning as soon as we encounter a high value for validation accuracy. (The theoretical best is 1.0 for training and 0.8 for validation.)

Is such a callback function possible?

Upvotes: 0

Views: 905

Answers (1)

cemsazara

Reputation: 1683

I updated the answer again, sorry I missed the epoch part. You need to define your own early-stopping callback. This answer to another question can help. Using that answer and changing it a little by adding an `epoch_threshold`:

import warnings
from keras.callbacks import Callback

class EarlyStoppingByLossVal(Callback):
    def __init__(self, monitor='val_loss', value=0.00001, verbose=0):
        super(EarlyStoppingByLossVal, self).__init__()
        self.monitor = monitor
        self.value = value
        self.verbose = verbose
        self.epoch_threshold = 1000  # ignore any epoch before this one

    def on_epoch_end(self, epoch, logs=None):
        logs = logs or {}
        current = logs.get(self.monitor)
        if current is None:
            warnings.warn("Early stopping requires %s available!" % self.monitor, RuntimeWarning)
            return  # avoid comparing None below

        if current < self.value and epoch > self.epoch_threshold:
            if self.verbose > 0:
                print("Epoch %05d: early stopping THR" % epoch)
            self.model.stop_training = True
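Since the question monitors validation *accuracy* (stop when it gets high) rather than loss (stop when it gets low), here is a hedged sketch of an accuracy-based variant with the comparison flipped. To keep it runnable without Keras installed, the class below is a plain object driven by a simulated training loop; in a real model you would subclass `keras.callbacks.Callback` and set `self.model.stop_training = True` exactly as in the code above. The names `EarlyStoppingByAccVal` and the simulated accuracy values are illustrative assumptions, not part of the original answer.

```python
import warnings

class EarlyStoppingByAccVal:
    """Accuracy-based variant of the callback above: stop once the
    monitored accuracy reaches a target, but only after epoch_threshold.
    (With Keras, subclass keras.callbacks.Callback; the logic is the same.)"""

    def __init__(self, monitor='val_accuracy', value=0.80,
                 epoch_threshold=1000, verbose=0):
        self.monitor = monitor
        self.value = value
        self.epoch_threshold = epoch_threshold
        self.verbose = verbose
        self.stopped_epoch = None  # stands in for model.stop_training here

    def on_epoch_end(self, epoch, logs=None):
        logs = logs or {}
        current = logs.get(self.monitor)
        if current is None:
            warnings.warn("Early stopping requires %s available!" % self.monitor,
                          RuntimeWarning)
            return
        # Stop only when accuracy is high enough AND we are past the threshold.
        if current >= self.value and epoch > self.epoch_threshold:
            if self.verbose > 0:
                print("Epoch %05d: early stopping" % epoch)
            self.stopped_epoch = epoch

# Simulated run: accuracy fluctuates below 0.80, then crosses it at epoch 1005.
cb = EarlyStoppingByAccVal(value=0.80, epoch_threshold=1000)
for epoch in range(1010):
    acc = 0.79 if epoch < 1005 else 0.81
    cb.on_epoch_end(epoch, {'val_accuracy': acc})
    if cb.stopped_epoch is not None:
        break
print(cb.stopped_epoch)  # 1005
```

Note the flipped comparison (`current >= self.value`): with accuracy, higher is better, so the stop condition inverts relative to the loss-based version.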

Upvotes: 2
