user530316

Reputation: 89

How to use different optimizers for each model layer Pytorch?

I have a two-layer network built in PyTorch and two different optimizers. I would like to use one optimizer on the first layer and the other optimizer on the second layer. Is this possible?

Upvotes: 1

Views: 3293

Answers (1)

flawr

Reputation: 11638

Yes, this is possible: when initializing an optimizer you pass it the parameters it should optimize, so that is where you make the split. For instance:

import torch
import torch.nn as nn

net = nn.Sequential(
    nn.Linear(1, 3),
    nn.Linear(3, 5),
    nn.Linear(5, 1),
)

# One optimizer for the first layer, another for the remaining two
opt1 = torch.optim.Adam(net[0].parameters(), lr=0.1)
opt2 = torch.optim.Adam([*net[1].parameters(), *net[2].parameters()], lr=0.001)
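A training step then looks like usual, except that you call `zero_grad()` and `step()` on both optimizers. A single backward pass fills the gradients for all parameters; each optimizer only updates the parameters it was given. Here is a minimal sketch (the data and MSE loss are made up for illustration, not from the question):

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

net = nn.Sequential(
    nn.Linear(1, 3),
    nn.Linear(3, 5),
    nn.Linear(5, 1),
)

# One optimizer per parameter subset, with different learning rates
opt1 = torch.optim.Adam(net[0].parameters(), lr=0.1)
opt2 = torch.optim.Adam([*net[1].parameters(), *net[2].parameters()], lr=0.001)

x = torch.randn(8, 1)   # dummy inputs
y = torch.randn(8, 1)   # dummy targets
loss_fn = nn.MSELoss()

for _ in range(3):
    # Clear gradients on both parameter subsets before the backward pass
    opt1.zero_grad()
    opt2.zero_grad()
    loss = loss_fn(net(x), y)
    loss.backward()  # one backward pass computes grads for all parameters
    opt1.step()      # each optimizer updates only its own parameters
    opt2.step()
```

Note that if all you want is different hyperparameters per layer (rather than genuinely different optimizer algorithms), a single optimizer with parameter groups achieves the same effect, e.g. `torch.optim.Adam([{'params': net[0].parameters(), 'lr': 0.1}, {'params': net[1].parameters()}], lr=0.001)`, and saves you the duplicate `zero_grad()`/`step()` calls.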

Upvotes: 6
