Reputation: 59
I'm training a CNN U-Net model for semantic segmentation of images, but the training loss seems to decrease at a much faster rate than the validation loss. Is this normal?
I'm using a loss of 0.002
The training and validation loss can be seen in the image below:
Upvotes: 3
Views: 2850
Reputation: 9871
Yes, this is perfectly normal.
As the NN learns, it fits the training samples it sees at every iteration, so it naturally gets better and better on them. The validation set is never used to update the weights, which is exactly why it is so important: it gives an honest estimate of how the model generalizes.
Basically: as long as the validation loss keeps decreasing along with the training loss, the model is still learning something useful. When the training loss keeps dropping but the validation loss plateaus or starts to rise, the model is overfitting.

We usually use early stopping to avoid the latter: if your validation loss doesn't improve for X epochs, stop training (X being a value such as 5 or 10).
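Most frameworks ship an early-stopping callback, but the patience logic itself is simple. Here is a minimal framework-agnostic sketch (the `EarlyStopping` class and the loss values are made up for illustration):

```python
class EarlyStopping:
    """Stop training when the validation loss hasn't improved for
    `patience` consecutive epochs (hypothetical minimal helper)."""

    def __init__(self, patience=5, min_delta=0.0):
        self.patience = patience      # how many bad epochs to tolerate
        self.min_delta = min_delta    # minimum change that counts as improvement
        self.best = float("inf")      # best validation loss seen so far
        self.counter = 0              # consecutive epochs without improvement

    def step(self, val_loss):
        """Record this epoch's validation loss; return True to stop."""
        if val_loss < self.best - self.min_delta:
            self.best = val_loss
            self.counter = 0
        else:
            self.counter += 1
        return self.counter >= self.patience


# Example: validation loss improves for a few epochs, then stalls.
stopper = EarlyStopping(patience=5)
val_losses = [0.9, 0.7, 0.6, 0.61, 0.62, 0.60, 0.63, 0.61]
for epoch, loss in enumerate(val_losses):
    if stopper.step(loss):
        print(f"stopping at epoch {epoch}, best val loss {stopper.best}")
        break
```

In Keras the equivalent is `tf.keras.callbacks.EarlyStopping(monitor='val_loss', patience=5)` passed to `model.fit`; PyTorch users typically write a loop like the one above.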
Upvotes: 5