Reputation: 175
I have trained a few neural networks and would like to know when I have a well-trained model.
First case: the training error decreases and then stays constant after a certain number of epochs, and the validation error behaves the same way. Has the model converged at that point?
Second case: the training error decreases over all simulated epochs, but the validation error first decreases and then starts increasing after 10 epochs. Should I stop training there?
In which case do I have the better-trained model? What is the desired behavior?
Thanks
Upvotes: 2
Views: 35
Reputation: 2372
Ideally, you would like both your training and validation error to decrease. But if you train for too long (i.e. your model tries too hard to fit your training dataset), it may start to overfit. That is the whole purpose of a validation set: it lets you check whether you are overfitting the training data.
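For concreteness, here is a minimal Keras sketch of that setup (the data, architecture, and hyperparameters below are all illustrative, not taken from the question); `validation_split` holds out part of the data so the training and validation errors can be compared epoch by epoch:

    import numpy as np
    from tensorflow.keras.models import Sequential
    from tensorflow.keras.layers import Dense, Input

    # Toy data, purely illustrative
    X = np.random.rand(1000, 20)
    y = np.random.randint(0, 2, size=(1000,))

    model = Sequential([
        Input(shape=(20,)),
        Dense(32, activation="relu"),
        Dense(1, activation="sigmoid"),
    ])
    model.compile(optimizer="adam", loss="binary_crossentropy")

    # validation_split reserves 20% of the data to measure validation error
    history = model.fit(X, y, epochs=50, validation_split=0.2, verbose=0)

    # history.history["loss"] and history.history["val_loss"] now hold the
    # per-epoch training and validation errors described above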
In the first case, since the error on both your training and validation sets decreases nicely, it is likely that the model has trained well. But this does not mean the model has found the global minimum (if that is what you were asking by "converged").
In the second case, once your validation error starts increasing, your model has begun to overfit the training data. You should stop training at that point, because training further will only make the overfitting worse. This technique is known as early stopping.
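A minimal sketch of early stopping using Keras' `EarlyStopping` callback (it reuses `model`, `X`, and `y` from the sketch above; the `patience` value is an illustrative choice, not a recommendation):

    from tensorflow.keras.callbacks import EarlyStopping

    early_stop = EarlyStopping(
        monitor="val_loss",         # watch the validation error
        patience=5,                 # tolerate 5 epochs without improvement
        restore_best_weights=True,  # roll back to the best epoch seen
    )

    history = model.fit(
        X, y,
        epochs=200,                 # an upper bound; training may stop earlier
        validation_split=0.2,
        callbacks=[early_stop],
        verbose=0,
    )

With `restore_best_weights=True`, the model you end up with is the one from the epoch where the validation error was lowest, which is exactly the stopping point described above.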
Hope this helps.
Upvotes: 1