PyTorch - How to get learning rate during training?

While training, I'd like to know the current value of the learning rate. What should I do?

This is my code:

my_optimizer = torch.optim.SGD(my_model.parameters(), 
                               lr=0.001, 
                               momentum=0.99, 
                               weight_decay=2e-3)

Thank you.

Upvotes: 36

Views: 66962

Answers (5)

Gold_Leaf

Reputation: 53

Reading through the source code of torch.optim.Optimizer and one of its subclasses, say torch.optim.Adam, it is clear that lr is stored in a dictionary named defaults in the subclass, and that this dictionary is then passed into the __init__ method of the base Optimizer class along with params. Upon receiving the arguments, the base class saves defaults into an attribute of the same name. Here is a small fragment of its source code:

def __init__(self, params, defaults):
    torch._C._log_api_usage_once("python.optimizer")
    self.defaults = defaults
    ...

Therefore, to retrieve the value of the learning rate from a torch optimizer, I believe the solution would be:

optimizer.defaults['lr']
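
For example, a minimal sketch using the SGD optimizer from the question (the model here is just a placeholder for illustration):

import torch

model = torch.nn.Linear(10, 2)  # placeholder model for illustration
optimizer = torch.optim.SGD(model.parameters(),
                            lr=0.001,
                            momentum=0.99,
                            weight_decay=2e-3)

print(optimizer.defaults['lr'])  # 0.001, the value passed at construction

Note that defaults holds the values passed to the constructor; if a scheduler later changes the learning rate, the updated value appears in optimizer.param_groups, not in defaults.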

Upvotes: 1

Andrey Taranov

Reputation: 79

Use

optimizer.param_groups[-1]['lr']
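
A quick sketch of reading it inside a training loop (the model and the loop body are placeholders; with a single parameter group, param_groups[-1] is the same as param_groups[0]):

import torch

model = torch.nn.Linear(10, 2)  # placeholder model for illustration
optimizer = torch.optim.SGD(model.parameters(), lr=0.001, momentum=0.99)

for epoch in range(3):
    # ... forward / backward / optimizer.step() would go here ...
    current_lr = optimizer.param_groups[-1]['lr']
    print(f"epoch {epoch}: lr = {current_lr}")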

Upvotes: 7

votrinhan88

Reputation: 41

As of PyTorch 1.13.0, you can access the list of learning rates via the method scheduler.get_last_lr(), or directly scheduler.get_last_lr()[0] if you only use a single learning rate.

Said method can be found in the schedulers' base class LRScheduler (see their code). It returns the attribute scheduler._last_lr from the base class, as Zahra has mentioned, but calling the method is preferable.
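
A minimal sketch, assuming a StepLR scheduler (the model and the scheduler settings are just placeholders):

import torch

model = torch.nn.Linear(10, 2)  # placeholder model for illustration
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=2, gamma=0.5)

for epoch in range(5):
    # ... training step would go here ...
    optimizer.step()
    scheduler.step()
    print(scheduler.get_last_lr())     # list with one lr per param group
    print(scheduler.get_last_lr()[0])  # the single lr as a float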

Edit: Thanks @igorkf for the reply

Upvotes: 4

Zahra

Reputation: 7207

Alternatively, you may use an lr_scheduler along with your optimizer and simply call the built-in lr_scheduler.get_lr() method.

Here is an example:

my_optimizer = torch.optim.Adam( my_model.parameters(), 
                                 lr = 0.001, 
                                 weight_decay = 0.002)

my_lr_scheduler = torch.optim.lr_scheduler.StepLR( my_optimizer,
                                                   step_size = 50,
                                                   gamma = 0.1)

# train
...
my_optimizer.step()
my_lr_scheduler.step()

# get learning rate
my_lr = my_lr_scheduler.get_lr()
# or
my_lr = my_lr_scheduler.optimizer.param_groups[0]['lr']

The added benefit of using an lr_scheduler is more control over how the lr changes over time (step decay, etc.). For the lr_scheduler arguments, refer to the PyTorch docs.

Upvotes: 13

MBT

Reputation: 24099

If you have only one parameter group, as in the example you've given, you can use this function and call it during training to get the current learning rate:

def get_lr(optimizer):
    for param_group in optimizer.param_groups:
        return param_group['lr']
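
A usage sketch with the SGD setup from the question, assuming the get_lr function above is defined (the model is a placeholder):

import torch

model = torch.nn.Linear(10, 2)  # placeholder model for illustration
optimizer = torch.optim.SGD(model.parameters(),
                            lr=0.001,
                            momentum=0.99,
                            weight_decay=2e-3)

print(get_lr(optimizer))  # 0.001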

Upvotes: 47
