IllyShaieb

Reputation: 111

PyTorch warning about using a non-full backward hook when the forward contains multiple autograd Nodes

After a recent upgrade, I now get the following warning when running my PyTorch training loop:

Using a non-full backward hook when the forward contains multiple autograd Nodes.

The training still runs and completes, but I am unsure where I am supposed to place the register_full_backward_hook function.

I have tried adding it to each of the layers in my neural network, but this gives further errors about using different hooks.

Can anyone please advise?

Upvotes: 8

Views: 7838

Answers (1)

Ivan

Reputation: 40648

PyTorch version 1.8.0 deprecated register_backward_hook (source code) in favor of register_full_backward_hook (source code).

You can find it in the patch notes here: Deprecated old style nn.Module backward hooks (PR #46163)

The warning you're getting:

Using a non-full backward hook when the forward contains multiple autograd Nodes is deprecated and will be removed in future versions. This hook will be missing some grad_input. Please use register_full_backward_hook to get the documented behavior.

It simply indicates that you should replace all register_backward_hook calls in your code with register_full_backward_hook to get the behavior described in the documentation page.
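For illustration, here is a minimal sketch of the change, assuming a toy nn.Sequential model and a hypothetical hook that just prints gradient shapes:

```python
import torch
import torch.nn as nn

# Toy model whose forward contains multiple autograd Nodes
model = nn.Sequential(nn.Linear(10, 10), nn.ReLU(), nn.Linear(10, 1))

# Hypothetical hook: inspect the gradients flowing out of the module
def hook(module, grad_input, grad_output):
    print(type(module).__name__, [g.shape for g in grad_output if g is not None])

# Old, deprecated call that triggers the warning:
# model.register_backward_hook(hook)

# New call with the documented behavior:
model.register_full_backward_hook(hook)

out = model(torch.randn(4, 10))
out.sum().backward()  # the hook fires during the backward pass
```

Note that you cannot mix the old and new style hooks on the same module; attempting to do so raises a RuntimeError, which is likely the source of the "different hooks" errors you saw when registering hooks layer by layer.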

Upvotes: 2
