Reputation: 89
I have a two-layer network built in PyTorch and two different optimizers. I would like to use one optimizer on the first layer and the other optimizer on the second layer. Is this possible?
Upvotes: 1
Views: 3293
Reputation: 11638
Yes, this is possible: when initializing an optimizer, you pass it the parameters it should optimize, and that is where you make the split. For instance:
import torch.nn as nn
import torch.optim

# Three stacked linear layers; the "first layer" from the question is net[0]
net = nn.Sequential(
    nn.Linear(1, 3),
    nn.Linear(3, 5),
    nn.Linear(5, 1)
)

# One Adam instance for the first layer, a second one (with a different lr) for the remaining layers
opt1 = torch.optim.Adam(params=net[0].parameters(), lr=0.1)
opt2 = torch.optim.Adam(params=[*net[1].parameters(), *net[2].parameters()], lr=0.001)
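In the training loop you then zero and step both optimizers around a single backward pass. A minimal sketch, assuming an MSE loss and dummy data (the tensors x and y here are hypothetical, just for illustration):

import torch

# Dummy batch: 8 samples with 1 input feature and 1 target each
x = torch.randn(8, 1)
y = torch.randn(8, 1)

loss = nn.MSELoss()(net(x), y)

# Clear gradients for both parameter groups, backprop once, then step each optimizer
opt1.zero_grad()
opt2.zero_grad()
loss.backward()
opt1.step()
opt2.step()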
Upvotes: 6