Reputation: 2026
Is it possible to add custom weights to the training instances in PyTorch? More explicitly, I'd like to add a custom weight for every row in my dataset. By default, the weights are 1, which means every data point is equally important for my model.
Upvotes: 5
Views: 7630
Reputation: 16856
PyTorch's loss functions support class weights, not sample weights. For sample weights you can do something like below (commented inline):
import torch
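# Toy input: 8 samples with 4 features each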
x = torch.rand(8, 4)
# Ground truth
y = torch.randint(2, (8,))
# Weights per sample
weights = torch.rand(8, 1)
# Add weights as a column, so that they will be passed through
# dataloaders in case you want to use one
x = torch.cat((x, weights), dim=1)
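# Simple linear model: 4 features in, 2 class logits out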
model = torch.nn.Linear(4, 2)
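# reduction='none' returns the loss per sample instead of the mean,
# so each sample's loss can be scaled by its weight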
loss_fn = torch.nn.CrossEntropyLoss(reduction='none')
def weighted_loss(y_hat, y, w):
    # Scale each sample's loss by its weight, then average
    return (loss_fn(y_hat, y) * w).mean()
loss = weighted_loss(model(x[:, :-1]), y, x[:, -1])
print(loss)
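If you do use a DataLoader, the appended weight column simply travels along with the features and gets split off again inside the loop. A minimal sketch of such a training loop, assuming the x, y, model and weighted_loss defined above (the batch size, optimizer and learning rate are arbitrary choices):
from torch.utils.data import DataLoader, TensorDataset
# Wrap the (features + weight column) tensor and the labels in a dataset
dataset = TensorDataset(x, y)
loader = DataLoader(dataset, batch_size=4, shuffle=True)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)  # arbitrary optimizer/lr
for xb, yb in loader:
    # Split the weight column back off before the forward pass
    features, w = xb[:, :-1], xb[:, -1]
    loss = weighted_loss(model(features), yb, w)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
For per-class (rather than per-sample) weights, CrossEntropyLoss already has this built in: pass a tensor with one weight per class, e.g. torch.nn.CrossEntropyLoss(weight=class_weights).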
Upvotes: 8