Weimin Chan

Reputation: 337

How to add an L2 regularization term to my loss function

I want to compare training with and without regularization, so I want to write two custom loss functions.

My loss function with the L2 norm:

loss = cross_entropy(ŷ, y) + (λ/2) · Σᵢ ‖wᵢ‖²

###NET
import torch
import torch.nn as nn
import torch.optim as optim

class CNN(nn.Module):
    def __init__(self):
        super(CNN, self).__init__()
        self.layer1 = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.MaxPool2d(2))
        self.layer2 = nn.Sequential(
            nn.Conv2d(16, 32, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.MaxPool2d(2))
        self.layer3 = nn.Sequential(
            nn.Conv2d(32, 32, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.MaxPool2d(4))
        self.fc = nn.Linear(32 * 32 * 32, 11)

    def forward(self, x):
        out = self.layer1(x)
        out = self.layer2(out)
        out = self.layer3(out)
        out = out.view(out.size(0), -1)  # flatten before the fully connected layer
        out = self.fc(out)
        return out

net = CNN()

###OPTIMIZER
criterion = nn.CrossEntropyLoss()
optimizer = optim.SGD(net.parameters(), lr=LR, momentum=MOMENTUM)  # LR and MOMENTUM are hyperparameters defined elsewhere

1. How can I add an L2 norm to my loss function?

2. If I want to write the update step myself (without using optim.SGD) and do the gradient descent via autograd, how can I do that?

Thanks for your help!

Upvotes: 2

Views: 7769

Answers (1)

Shai

Reputation: 114796

You can explicitly compute the (squared) L2 norm of the weights yourself and add it to the loss.

reg = 0.
for param in net.parameters():  # iterate over the instance net, not the class CNN
    reg += 0.5 * (param ** 2).sum()  # replace with param.abs().sum() to get L1 regularization
loss = criterion(net(x), y) + reg_lambda * reg  # make the regularization part of the loss
loss.backward()  # continue as usual
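
Alternatively, for plain L2 regularization you do not have to touch the loss at all: the `weight_decay` argument of `optim.SGD` adds the contribution `reg_lambda * param` to every parameter's gradient, which is exactly the gradient of the `0.5 * (param ** 2).sum()` penalty above:

optimizer = optim.SGD(net.parameters(), lr=LR, momentum=MOMENTUM,
                      weight_decay=reg_lambda)  # built-in L2 regularization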

See this thread for more info.
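
For your second question: you can skip the optimizer entirely and apply the gradient-descent update yourself after `backward()`. Here is a minimal sketch (the helper name `train_step` is mine; it assumes the `criterion` defined above and that `x`, `y` are one training batch):

def train_step(net, x, y, lr, reg_lambda):
    # forward pass with the L2 penalty folded into the loss
    reg = sum(0.5 * (p ** 2).sum() for p in net.parameters())
    loss = criterion(net(x), y) + reg_lambda * reg

    net.zero_grad()   # clear gradients left over from the previous step
    loss.backward()   # autograd fills p.grad for every parameter

    with torch.no_grad():            # do not record the update in the graph
        for p in net.parameters():
            p -= lr * p.grad         # vanilla gradient-descent update
    return loss.item()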

Upvotes: 4
