IsshikiHugh

Reputation: 1

How to control `trainer.global_step` in PyTorch Lightning?

In PyTorch Lightning, when `self.automatic_optimization = False`, the trainer increments `global_step` once for every `optimizer.step()` call executed during training.

However, I step two different optimizers in a single training step, which means `self.trainer.global_step` advances twice per batch.
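For reference, here is a minimal sketch of my setup (the modules and losses are placeholders, not my actual model):

```python
import torch
import pytorch_lightning as pl


class TwoOptimizerModule(pl.LightningModule):
    def __init__(self):
        super().__init__()
        self.automatic_optimization = False  # manual optimization
        self.net_a = torch.nn.Linear(8, 8)
        self.net_b = torch.nn.Linear(8, 8)

    def configure_optimizers(self):
        opt_a = torch.optim.Adam(self.net_a.parameters(), lr=1e-3)
        opt_b = torch.optim.Adam(self.net_b.parameters(), lr=1e-3)
        return opt_a, opt_b

    def training_step(self, batch, batch_idx):
        opt_a, opt_b = self.optimizers()

        loss_a = self.net_a(batch).pow(2).mean()
        opt_a.zero_grad()
        self.manual_backward(loss_a)
        opt_a.step()  # global_step += 1

        loss_b = self.net_b(batch).pow(2).mean()
        opt_b.zero_grad()
        self.manual_backward(loss_b)
        opt_b.step()  # global_step += 1 again -> two steps per batch
```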

That's a problem: `self.trainer.global_step` is always doubled once the second optimizer is enabled, so my checkpoint callbacks fire at the wrong steps, and the resulting checkpoints no longer align with those from experiments that use only one optimizer.
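The callback in question is a step-based `ModelCheckpoint`, roughly like this (the interval is just illustrative):

```python
from pytorch_lightning.callbacks import ModelCheckpoint

# With two optimizer steps per batch, global_step reaches 1000 after
# only 500 batches, so checkpoints land at half the intended interval.
checkpoint_cb = ModelCheckpoint(
    dirpath="checkpoints/",
    filename="model-{step}",
    every_n_train_steps=1000,
    save_top_k=-1,
)
```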

Is there any simple way I can solve this problem?

To stress: I am trying to control `global_step` itself to make sure my checkpoint callback works correctly. (I am not sure whether other things would also break if I only modified the callback.)
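One workaround I considered is stepping the second optimizer on the raw `torch.optim` object (via `use_pl_optimizer=False`) so that Lightning only counts the first one, but I am not sure it is safe, e.g. with mixed precision:

```python
# Inside the same LightningModule as above; compute_loss_a/b are
# hypothetical loss helpers, not real Lightning API.
def training_step(self, batch, batch_idx):
    # Wrapped optimizer: Lightning counts its step() in global_step.
    opt_a = self.optimizers(use_pl_optimizer=True)[0]
    # Raw torch optimizer: its step() bypasses Lightning's counter
    # (and possibly Lightning's precision handling as well).
    opt_b = self.optimizers(use_pl_optimizer=False)[1]

    loss_a = self.compute_loss_a(batch)
    opt_a.zero_grad()
    self.manual_backward(loss_a)
    opt_a.step()  # counted: global_step += 1

    loss_b = self.compute_loss_b(batch)
    opt_b.zero_grad()
    self.manual_backward(loss_b)
    opt_b.step()  # not counted: global_step unchanged
```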

I tried to overwrite `self.trainer.global_step`, but found that it is a read-only property (it only has a getter) and cannot be assigned to.
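Minimal reproduction of that attempt (the exact error text depends on the Python/Lightning version):

```python
import pytorch_lightning as pl

trainer = pl.Trainer()
# Trainer.global_step is a read-only property (getter only), so
# assigning to it raises AttributeError.
trainer.global_step = 0
```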

Upvotes: 0

Views: 223

Answers (0)
