Reputation: 1290
I am new to pytorch and I can't run backward() on even the most simple network without generating an error. For example:
(Linear(6, 6)(Variable(torch.zeros([10, 6]))) - Variable(torch.zeros([10, 6]))).backward()
Throws the following error
RuntimeError: element 0 of variables does not require grad and does not have a grad_fn
What have I done wrong in the code to create this issue?
Upvotes: 2
Views: 5050
Reputation: 1334
This error happens when PyTorch cannot find any model parameters that have requires_grad = True, i.e. all of the model's parameters have requires_grad = False.
There are different possible reasons: you may be freezing the whole model, or you may not be correctly swapping out the model's final layer - for example, in a ResNet the classifier head is model.fc, not model.classifier.
You always have to be careful where you place this:
for param in model.parameters():
param.requires_grad = False
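As a minimal sketch of the fine-tuning pattern described above (using a small Sequential as a stand-in for a real pretrained backbone): freeze everything first, then replace the final layer, whose freshly created parameters default to requires_grad = True, so backward() succeeds.

```python
import torch
import torch.nn as nn

# Stand-in "model": index 0 plays the backbone, index 1 the classifier head.
model = nn.Sequential(
    nn.Linear(8, 8),
    nn.Linear(8, 2),
)

# Freeze all parameters first ...
for param in model.parameters():
    param.requires_grad = False

# ... then swap in a fresh final layer; its new parameters
# default to requires_grad=True, so the graph has something to train.
model[1] = nn.Linear(8, 2)

out = model(torch.zeros(4, 8)).sum()
out.backward()  # no RuntimeError: the new head's parameters require grad
```

If the replacement step is skipped (or the wrong attribute is replaced), every parameter stays frozen and backward() raises exactly the error from the question.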
Upvotes: 5
Reputation: 299
Try passing a grad_output of matching shape as an argument to backward():
(Linear(6, 6)(Variable(torch.zeros([10, 6]))) - Variable(torch.zeros([10, 6]))).backward(torch.zeros([10, 6]))
The following answer has more details: Why should be the function backward be called only on 1 element tensor or with gradients w.r.t to Variable?
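A modern equivalent of the question's snippet (Variable is deprecated; plain tensors are used directly): the output is a (10, 6) tensor rather than a scalar, so backward() either needs an explicit gradient of the same shape, or the output must first be reduced to a scalar.

```python
import torch
import torch.nn as nn

layer = nn.Linear(6, 6)
out = layer(torch.zeros(10, 6)) - torch.zeros(10, 6)

# Option 1: pass an explicit gradient of the same shape as `out`
out.backward(torch.ones(10, 6), retain_graph=True)

# Option 2: reduce to a scalar, then call backward() with no arguments
out.sum().backward()
```

Either way, gradients accumulate into the layer's parameters instead of raising the RuntimeError from the question.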
Upvotes: 2