etnamaid

Reputation: 29

MultiplicativeLR scheduler not working properly when calling scheduler.step()

In the PyTorch Lightning framework, I am configuring the optimizers like this:

    def configure_optimizers(self):
        opt = torch.optim.Adam(self.model.parameters(), lr=cfg.learning_rate)
        # modified to fit lightning
        sch = torch.optim.lr_scheduler.MultiplicativeLR(opt, lr_lambda=0.95)  # decrease of 5% every epoch
        return [opt], [sch]

Then, in training_step, I can either call the lr_scheduler manually or let Lightning step it automatically. Either way, I get this error:

 lr_scheduler["scheduler"].step()
  File "/home/lsa/anaconda3/envs/randla_36/lib/python3.6/site-packages/torch/optim/lr_scheduler.py", line 152, in step
    values = self.get_lr()
  File "/home/lsa/anaconda3/envs/randla_36/lib/python3.6/site-packages/torch/optim/lr_scheduler.py", line 329, in get_lr
    for lmbda, group in zip(self.lr_lambdas, self.optimizer.param_groups)]
  File "/home/lsa/anaconda3/envs/randla_36/lib/python3.6/site-packages/torch/optim/lr_scheduler.py", line 329, in <listcomp>
    for lmbda, group in zip(self.lr_lambdas, self.optimizer.param_groups)]
TypeError: 'float' object is not callable
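
For reference, the manual variant mentioned above looks roughly like this (a sketch assuming Lightning's manual-optimization API; `compute_loss` is a hypothetical helper):

    # inside the same LightningModule
    def __init__(self):
        super().__init__()
        self.automatic_optimization = False  # step the optimizer/scheduler by hand

    def training_step(self, batch, batch_idx):
        opt = self.optimizers()
        loss = self.compute_loss(batch)  # hypothetical loss helper
        opt.zero_grad()
        self.manual_backward(loss)
        opt.step()
        return loss

    def on_train_epoch_end(self):
        sch = self.lr_schedulers()
        sch.step()  # this is the call that raises the TypeError above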

But if I use any other scheduler, not only does VSCode recognize it as belonging to PyTorch, I also do not get this error.

PyTorch version 1.10, Lightning version 1.5

Upvotes: 0

Views: 438

Answers (1)

Andrey Lukyanenko

Reputation: 3851

I think that you need to change the value of `lr_lambda`. Here is the link to the documentation: https://pytorch.org/docs/stable/generated/torch.optim.lr_scheduler.MultiplicativeLR.html

lr_lambda (function or list) – A function which computes a multiplicative factor given an integer parameter epoch, or a list of such functions, one for each group in optimizer.param_groups.
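
To illustrate the list form, with two parameter groups you could pass one lambda per group (a sketch; the two `nn.Linear` modules are stand-ins for real submodules):

    import torch
    from torch import nn

    encoder, decoder = nn.Linear(4, 4), nn.Linear(4, 4)  # stand-in submodules
    opt = torch.optim.Adam(
        [{"params": encoder.parameters()}, {"params": decoder.parameters()}],
        lr=0.1,
    )
    # one multiplicative factor per param group: 5% decay for the first, 10% for the second
    sch = torch.optim.lr_scheduler.MultiplicativeLR(
        opt, lr_lambda=[lambda epoch: 0.95, lambda epoch: 0.90]
    )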

So, if you want a decrease of 5% every epoch, then you could do the following:


    def configure_optimizers(self):
        opt = torch.optim.Adam(self.model.parameters(), lr=cfg.learning_rate)
        # modified to fit lightning
        lmbda = lambda epoch: 0.95
        sch = torch.optim.lr_scheduler.MultiplicativeLR(opt, lr_lambda=lmbda)  # decrease of 5% every epoch
        return [opt], [sch]
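
As a quick sanity check outside Lightning (a minimal sketch with a dummy parameter), the learning rate then shrinks by 5% on each `step()`:

    import torch

    p = torch.nn.Parameter(torch.zeros(1))
    opt = torch.optim.Adam([p], lr=0.1)
    sch = torch.optim.lr_scheduler.MultiplicativeLR(opt, lr_lambda=lambda epoch: 0.95)

    for epoch in range(3):
        opt.step()  # optimizer step first, to avoid the ordering warning
        sch.step()
        print(opt.param_groups[0]["lr"])  # 0.095, 0.09025, 0.0857375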

Upvotes: 1
