fferri

Reputation: 18950

pytorch grad is None after .backward()

I just installed torch-1.0.0 on Python 3.7.2 (macOS) and tried the tutorial, but the following code:

import torch
x = torch.ones(2, 2, requires_grad=True)
y = x + 2
z = y * y * 3
out = z.mean()
out.backward()
print(out.grad)

prints None, which is not what I expected.

What's the problem?

Upvotes: 20

Views: 17203

Answers (2)

patapouf_ai

Reputation: 18743

If you want the non-leaf gradients, you can use register_hook on your non-leaf tensors to save them somewhere (as shown in the following answer: How to return intermideate gradients (for non-leaf nodes) in pytorch?).
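A minimal sketch of the register_hook approach (the grads dict and hook below are my own illustration, not from the linked answer): the hook fires during backward() and receives the gradient flowing through the non-leaf tensor.

```python
import torch

x = torch.ones(2, 2, requires_grad=True)
y = x + 2              # non-leaf: its .grad is normally not kept
z = y * y * 3
out = z.mean()

grads = {}  # stash intermediate gradients here
y.register_hook(lambda g: grads.setdefault('y', g))

out.backward()
print(grads['y'])  # d(out)/dy = 6*y/4 = 4.5 everywhere
```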

Upvotes: 1

Umang Gupta

Reputation: 16480

This is the expected result.

.backward accumulates gradients only in the leaf nodes. out is not a leaf node, hence its grad is None.
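To illustrate with the tutorial's own example: after backward(), the gradient lands on the leaf x, while out.grad stays None (a sketch; the expected values follow from d(out)/dx = 6*(x+2)/4).

```python
import torch

x = torch.ones(2, 2, requires_grad=True)  # leaf tensor
y = x + 2
z = y * y * 3
out = z.mean()
out.backward()

print(x.grad)    # tensor([[4.5, 4.5], [4.5, 4.5]]) -- leaf gets the gradient
print(out.grad)  # None -- out is not a leaf
```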

autograd.backward does the same thing.

autograd.grad can be used to find the gradient of any tensor w.r.t. any other tensor. So if you call autograd.grad(out, out) you get (tensor(1.),) as output, which is as expected.
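A small sketch of that call (retain_graph=True in the first call is my addition, so the graph survives for a second query):

```python
import torch

x = torch.ones(2, 2, requires_grad=True)
out = (3 * (x + 2) ** 2).mean()

# Gradient of out w.r.t. itself is trivially 1.
g_out = torch.autograd.grad(out, out, retain_graph=True)
print(g_out)  # (tensor(1.),)

# Gradient of out w.r.t. the leaf x.
g_x = torch.autograd.grad(out, x)
print(g_x[0])  # tensor([[4.5, 4.5], [4.5, 4.5]])
```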


Upvotes: 18
