rubiks

Reputation: 53

Print the validation loss in each epoch in PyTorch

I want to print the model's validation loss in each epoch. What is the right way to get and print it?

Is it like this:

criterion = nn.CrossEntropyLoss(reduction='mean')
losses = 0.0
for x, y in validation_loader:
    optimizer.zero_grad()
    out = model(x)
    loss = criterion(out, y)
    loss.backward()
    optimizer.step()
    losses += loss

display_loss = losses / len(validation_loader)
print(display_loss)

or like this:

criterion = nn.CrossEntropyLoss(reduction='mean')
losses = 0.0
for x, y in validation_loader:
    optimizer.zero_grad()
    out = model(x)
    loss = criterion(out, y)
    loss.backward()
    optimizer.step()
    losses += loss

display_loss = losses / len(validation_loader.dataset)
print(display_loss)

or something else? Thank you.

Upvotes: 4

Views: 7869

Answers (1)

Shai

Reputation: 114796

NO!!!!

Under no circumstances should you train your model (i.e., call loss.backward() + optimizer.step()) using validation / test data!!!

If you want to validate your model:

model.eval()  # put drop-out/batch-norm layers in eval mode
loss = 0.0
with torch.no_grad():
    for x, y in validation_loader:
        out = model(x)  # only forward pass - NO gradients!!
        loss += criterion(out, y).item()  # .item() keeps a plain float
# mean loss - divide the total by the number of batches
val_loss = loss / len(validation_loader)

Note that the optimizer has nothing to do with evaluating the model on the validation set. You do not change the model according to the validation data; you only validate it.
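For context, here is a minimal sketch of how this fits into a per-epoch loop that prints the validation loss once per epoch. It assumes model, optimizer, criterion, train_loader, validation_loader, and num_epochs are already defined in your script; adapt the names to your own setup:

import torch

def validate(model, loader, criterion):
    # Forward passes only: no optimizer, no backward(), no gradients.
    model.eval()
    total = 0.0
    with torch.no_grad():
        for x, y in loader:
            total += criterion(model(x), y).item()
    return total / len(loader)  # mean loss per batch

for epoch in range(num_epochs):
    model.train()  # re-enable drop-out/batch-norm updates before training
    for x, y in train_loader:
        optimizer.zero_grad()
        loss = criterion(model(x), y)
        loss.backward()   # gradients come from training data only
        optimizer.step()

    val_loss = validate(model, validation_loader, criterion)
    print(f'epoch {epoch}: validation loss = {val_loss:.4f}')

Calling model.train() at the start of each epoch matters because validate() leaves the model in eval mode.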

Upvotes: 12
