Reputation: 966
I'm currently implementing a neural network that makes use of dropout. A question that came up was when to stop training.
Normally I would use early stopping to determine that point, but the original dropout paper states that "dropout allows much larger nets to be trained and removes the need for early stopping."
If they don't apply early stopping what stopping procedure do they use?
Upvotes: 1
Views: 458
Reputation: 43477
Early stopping refers to any scheme of halting training once you get good enough results, or once you stop seeing significant improvement (typically on a validation set) for a while.
Their stopping procedure is just "run the training for x number of iterations / epochs".
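To make the contrast concrete, here is a minimal sketch of both stopping criteria. This is not from the paper; the function names, the patience-based rule, and the loss history are illustrative assumptions:

```python
def should_stop_early(val_losses, patience=5):
    """Patience-based early stopping (illustrative, not from the paper):
    stop once the best validation loss has not improved for `patience` epochs."""
    if len(val_losses) <= patience:
        return False
    best_before = min(val_losses[:-patience])   # best loss before the patience window
    best_recent = min(val_losses[-patience:])   # best loss inside the patience window
    return best_recent > best_before            # no improvement in the window -> stop

# Early stopping: halt as soon as validation loss stagnates.
stagnating = [1.0, 0.9, 0.8, 0.85, 0.86, 0.87, 0.88, 0.89]
print(should_stop_early(stagnating))   # stops: recent losses never beat 0.8

# Fixed-budget stopping (what the answer describes): just run x epochs.
max_epochs = 200
for epoch in range(max_epochs):
    pass  # train_one_epoch(...) would go here
```

With dropout the claim is that the larger net overfits less, so the simple fixed-budget loop is considered sufficient in place of the patience logic.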
Upvotes: 2