Reputation: 2270
We set model.train() during training, but during my training iterations, I also want to do a forward pass of the training dataset to see what my new loss is. When doing this, should I temporarily set model.eval()?
Upvotes: 2
Views: 1941
Reputation: 24691
If your network has layers which behave differently during inference (torch.nn.BatchNormNd and torch.nn.DropoutNd are examples; in the dropout case all neurons are used during evaluation, while during training the kept activations are scaled by the inverse of the keep probability) and you want to test how your network currently performs (which is usually called a validation step), then it is mandatory to use module.eval().
It is a common (and very good!) practice to always switch to eval mode when doing inference-like things, regardless of whether it actually changes your model's behavior.
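As a minimal sketch (the names model, data_loader, criterion, and device are placeholders, not from the question), a validation-style loss check inside your training loop could look like this:

```python
import torch

def evaluate_loss(model, data_loader, criterion, device="cpu"):
    """Average loss over a data loader, without updating the model."""
    model.eval()  # switch BatchNorm/Dropout layers to inference behaviour
    total_loss, n_batches = 0.0, 0
    for inputs, targets in data_loader:
        inputs, targets = inputs.to(device), targets.to(device)
        outputs = model(inputs)
        total_loss += criterion(outputs, targets).item()
        n_batches += 1
    model.train()  # switch back so the next training step behaves correctly
    return total_loss / max(n_batches, 1)
```

Note the model.train() call at the end: forgetting it means the following training steps run with BatchNorm/Dropout still in eval mode.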
EDIT:
You should also wrap inference in a with torch.no_grad(): block (see the official tutorial code), as gradients are not needed during this phase and it is wasteful to compute them.
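For example, assuming the evaluate_loss helper sketched above, the call can simply be wrapped in the context manager:

```python
import torch

# No computation graph is built for the forward passes inside this block,
# which saves memory and time during the validation pass.
with torch.no_grad():
    val_loss = evaluate_loss(model, val_loader, criterion)
print(f"validation loss: {val_loss:.4f}")
```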
Upvotes: 1