Chutlhu

Reputation: 359

How to extend a Loss Function Pytorch

I would like to create my own custom loss function as a weighted combination of 3 loss functions, something like:

criterion = torch.nn.CrossEntropyLoss(out1, lbl1) + \
            torch.nn.CrossEntropyLoss(out2, lbl2) + \
            torch.nn.CrossEntropyLoss(out3, lbl3)

I am doing this to address a multi-class, multi-label classification problem. Does it make sense? How do I implement such a loss function correctly in PyTorch?

Thanks

Upvotes: 1

Views: 1096

Answers (1)

kmario23

Reputation: 61485

Your way of approaching the problem is correct, but there's a mistake in your code: torch.nn.CrossEntropyLoss is a class, so it has to be instantiated first and then called on the (output, target) pair. Here's a fix for that:

import torch

# Instantiate the criterion, then call it on each (output, target) pair
loss1 = torch.nn.CrossEntropyLoss()(out1, lbl1)
loss2 = torch.nn.CrossEntropyLoss()(out2, lbl2)
loss3 = torch.nn.CrossEntropyLoss()(out3, lbl3)

# The combined loss is simply the sum of the three terms
final_loss = loss1 + loss2 + loss3

Then you can call .backward() on final_loss, which computes the gradients for all three terms and backpropagates them.
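For concreteness, here is a minimal, self-contained sketch of how the summed loss fits into a single training step. The toy trunk/head architecture and the dummy tensors are illustrative only, not taken from the question:

import torch

# Toy setup: one shared trunk with three classification heads (illustrative)
trunk = torch.nn.Linear(16, 32)
head1 = torch.nn.Linear(32, 5)   # 5 classes for label set 1
head2 = torch.nn.Linear(32, 4)   # 4 classes for label set 2
head3 = torch.nn.Linear(32, 3)   # 3 classes for label set 3

params = (list(trunk.parameters()) + list(head1.parameters())
          + list(head2.parameters()) + list(head3.parameters()))
optimizer = torch.optim.SGD(params, lr=0.1)
criterion = torch.nn.CrossEntropyLoss()

x = torch.randn(8, 16)                      # dummy batch of 8 samples
lbl1 = torch.randint(0, 5, (8,))
lbl2 = torch.randint(0, 4, (8,))
lbl3 = torch.randint(0, 3, (8,))

optimizer.zero_grad()
h = torch.relu(trunk(x))
final_loss = (criterion(head1(h), lbl1)
              + criterion(head2(h), lbl2)
              + criterion(head3(h), lbl3))
final_loss.backward()                       # gradients flow through all three terms
optimizer.step()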

Also, it's possible to weight each of the component losses, where the weights themselves are learned during the training process.
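As a sketch of one such scheme (homoscedastic uncertainty weighting, where each term gets a learned log-variance), the class below is illustrative and not a built-in PyTorch API:

import torch

class WeightedMultiLoss(torch.nn.Module):
    """Combine several losses with learnable weights.

    Each term i contributes exp(-s_i) * loss_i + s_i, where s_i is a learned
    log-variance. This keeps the effective weight positive and penalizes
    driving it toward zero.
    """
    def __init__(self, num_losses=3):
        super().__init__()
        self.log_vars = torch.nn.Parameter(torch.zeros(num_losses))

    def forward(self, *losses):
        total = 0.0
        for i, loss in enumerate(losses):
            total = total + torch.exp(-self.log_vars[i]) * loss + self.log_vars[i]
        return total

# Usage: final_loss = WeightedMultiLoss(3)(loss1, loss2, loss3)
# Remember to pass the module's parameters to the optimizer together with the
# model's, e.g. torch.optim.Adam(list(model.parameters()) + list(weighting.parameters()))
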

You can refer to the discussion combine-multiple-criterions-to-a-loss-function for more information.

Upvotes: 2
