Reputation: 428
In the case of overfitting, to my knowledge the val_loss should rise while the train_loss keeps falling.
But what about the case below, where the val_loss stays low? Is this model underfitting horribly, or is it something else entirely?
Previously my models would overfit badly, so I added dropout of 0.3 (the model is 4 CuDNNGRU layers of 64 units each plus one Dense layer, batch size 64). Should I reduce the dropout?
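For reference, a minimal sketch of the setup described above; the input shape and output head are assumptions, not from the question. (In TF 2.x, `keras.layers.GRU` with default arguments uses the cuDNN kernel automatically, which replaces the old `CuDNNGRU` layer.)

```python
from tensorflow import keras
from tensorflow.keras import layers

# Hypothetical reconstruction: 4 recurrent layers of 64 units with
# dropout 0.3 between them, then one Dense layer. Timesteps/features
# and the single-unit output are placeholders.
model = keras.Sequential([
    layers.Input(shape=(100, 8)),           # (timesteps, features) assumed
    layers.GRU(64, return_sequences=True),
    layers.Dropout(0.3),
    layers.GRU(64, return_sequences=True),
    layers.Dropout(0.3),
    layers.GRU(64, return_sequences=True),
    layers.Dropout(0.3),
    layers.GRU(64),
    layers.Dropout(0.3),
    layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")
# model.fit(x_train, y_train, batch_size=64, validation_data=(x_val, y_val))
```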
Upvotes: 2
Views: 208
Reputation: 7745
This is neither overfitting nor underfitting. Some people refer to it as an unknown fit. Validation loss << training loss happens when you apply regularization (L1, L2, dropout, ...) in Keras, because it is applied during training only, not during testing (validation). So it makes sense that your training loss is bigger: with dropout, for example, not all neurons are available in the forward pass.
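A quick way to see this asymmetry in Keras (a minimal sketch; the tensor shape and rate here are arbitrary): `Dropout` only zeroes units when the layer is called with `training=True`, which is what `fit()` does, while evaluation and prediction run it as the identity.

```python
import numpy as np
import tensorflow as tf

# A single Dropout layer on a constant input makes the difference visible.
inputs = tf.keras.Input(shape=(4,))
outputs = tf.keras.layers.Dropout(0.5)(inputs)
model = tf.keras.Model(inputs, outputs)

x = np.ones((1, 4), dtype="float32")
print(model(x, training=True))   # some units zeroed, survivors scaled by 1/(1-0.5)
print(model(x, training=False))  # identity: dropout is disabled at evaluation time
```

This is why the training loss can sit above the validation loss without anything being wrong: the two losses are computed by slightly different networks.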
But what is clear is that your model is not being optimized on your validation set (the curve is almost a flat line). This can be due to many things:
Hope these tips help you out.
Upvotes: 2