mon

Reputation: 22326

What are the possible values for the arguments of tf.keras.callbacks.EarlyStopping, and what do they mean?

tf.keras.callbacks.EarlyStopping has arguments such as monitor, but the documentation does not mention the values that can be specified. Where can I find them?

tf.keras.callbacks.EarlyStopping(
    monitor="val_loss",
    min_delta=0,
    patience=0,
    verbose=0,
    mode="auto",
    baseline=None,
    restore_best_weights=False,
)

monitor

Where are the available values documented?

baseline

baseline: Baseline value for the monitored quantity. Training will stop if the model doesn't show improvement over the baseline.

Please explain exactly what this means. If it is set to 0.6 and the monitored quantity is accuracy, and the first epoch's accuracy is 0.5, will execution stop there?

verbose

What values can be specified? Is 1 the most verbose? What is the maximum value, and what does each level mean?

Upvotes: 1

Views: 766

Answers (2)

vERISBABY.

Reputation: 366

  • The monitor argument of tf.keras.callbacks.EarlyStopping typically takes one of four values: 'loss', 'accuracy', 'val_loss', 'val_accuracy'.
  • model.fit() will stop training when the monitored metric (e.g. the accuracy value) has stopped improving, i.e. has plateaued at its best value (the "max value", as you say); see the sketch after this list.
  • The accuracy the model can reach depends on many choices made before training, such as the model's layers and their arguments, how you preprocess the data, the dataset you provide, data augmentation, and so on.
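
A minimal sketch of how this could look in practice (the data and model below are toy placeholders, not from the question; only the EarlyStopping usage itself is the point):

import numpy as np
import tensorflow as tf

# Toy data and model, just to keep the sketch self-contained.
x = np.random.rand(200, 4).astype("float32")
y = np.random.randint(0, 2, size=(200,)).astype("float32")

model = tf.keras.Sequential([
    tf.keras.layers.Dense(8, activation="relu", input_shape=(4,)),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

# Stop when validation accuracy fails to improve by at least 0.01
# for 3 consecutive epochs, keeping the best weights seen so far.
early_stop = tf.keras.callbacks.EarlyStopping(
    monitor="val_accuracy",
    min_delta=0.01,
    patience=3,
    restore_best_weights=True,
)

model.fit(x, y, validation_split=0.2, epochs=50, callbacks=[early_stop])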

Upvotes: 0

Poe Dator

Reputation: 4913

The available metrics are the log items that you can see in the history object or in TensorBoard. Below is an excerpt from the TF source code at tensorflow/python/keras/callbacks.py:

Typically the metrics are set by the `Model.compile` method. Note:
    * Prefix the name with `"val_"` to monitor validation metrics.
    * Use `"loss"` or `"val_loss"` to monitor the model's total loss.
    * If you specify metrics as strings, like `"accuracy"`, pass the same
      string (with or without the `"val_"` prefix).
    * If you pass `metrics.Metric` objects, `monitor` should be set to
      `metric.name`.
    * If you're not sure about the metric names you can check the contents
      of the `history.history` dictionary returned by
      `history = model.fit()`.
    * Multi-output models set additional prefixes on the metric names.
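
For example, to see which metric names are available for monitor, you can run a short fit and inspect history.history. The model and data here are arbitrary placeholders; the keys you get depend on your own compile() call:

import numpy as np
import tensorflow as tf

# Toy data and a trivial model, just to produce a history object.
x = np.random.rand(200, 4).astype("float32")
y = np.random.randint(0, 2, size=(200,)).astype("float32")

model = tf.keras.Sequential([tf.keras.layers.Dense(1, activation="sigmoid")])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

history = model.fit(x, y, validation_split=0.2, epochs=1, verbose=0)

# Every key in this dictionary is a valid value for `monitor`.
print(history.history.keys())
# -> dict_keys(['loss', 'accuracy', 'val_loss', 'val_accuracy'])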

Upvotes: 1
