esh3390

Reputation: 125

"RuntimeError: a view of a leaf Variable that requires grad is being used in an in-place operation" when there's actually no in-place operations

I am working on replicating a paper, but I am running into trouble with it.

According to the log, the error is RuntimeError: a view of a leaf Variable that requires grad is being used in an in-place operation. However, when I check the line the error refers to, it is just a simple property setter inside the class:

@pdfvec.setter
def pdfvec(self, value):
   self.param.pdfvec[self.key] = value   # the line the error message points to

Aren't in-place operations something like += or *=? I don't see why this error message appeared on this line.

I am really confused by this message, and I would be glad if anyone knows a possible reason this can happen.

For additional context, this is the part of the code where the setter is called:

def _update_params(params, pdfvecs):
    idx = 0
    for param in params:
        totdim = param.stats.numel()
        shape = param.stats.shape
        param.pdfvec = pdfvecs[idx: idx + totdim].reshape(shape)   # where the setter function was called
        idx += totdim

I know this may still not be enough information to solve the problem, but if you know any possible reason the error message appeared, I would be really glad to hear it.
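
In case it helps, here is a minimal snippet that raises the same error for me (a simplified stand-in, not the actual class; pdfvec here is just a plain leaf tensor):

import torch

pdfvec = torch.zeros(3, 4, requires_grad=True)  # a leaf tensor, like a model parameter

# indexed assignment writes into a view of the leaf in place
pdfvec[0] = torch.ones(4)
# RuntimeError: a view of a leaf Variable that requires grad is being
# used in an in-place operation.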

Upvotes: 8

Views: 16061

Answers (2)

Fang WU

Reputation: 423

Loosely speaking, tensors you create directly are leaf variables. An in-place operation is one that modifies a tensor's data directly, without making a copy; indexed assignment like x[key] = value counts, because it writes into the existing storage. PyTorch does not allow in-place operations on leaf variables that have requires_grad=True (such as the parameters of your model), because it is ambiguous how autograd should handle them. If you want the operation to be differentiable, you can work around the limitation by cloning the leaf variable first (or by using an out-of-place version of the operator).
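
For example, a minimal sketch of the clone workaround (the tensor names here are illustrative, not from the question):

import torch

x = torch.zeros(3, requires_grad=True)  # leaf variable
# x[0] = 1.0  # would raise the RuntimeError from the question

y = x.clone()        # y is a non-leaf copy, still tracked by autograd
y[0] = 1.0           # in-place writes on a non-leaf are fine
y.sum().backward()   # gradients flow back through the clone
print(x.grad)        # tensor([0., 1., 1.]) -- x[0] was overwritten, so its grad is 0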

My suggestion: if you do not need autograd to track the assignment at all, wrap it in torch.no_grad():

with torch.no_grad():
    (your in-place operation here)
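
Applied to the setter from the question, that would look something like this (a sketch, assuming the assignment does not need to be differentiable):

@pdfvec.setter
def pdfvec(self, value):
    with torch.no_grad():
        self.param.pdfvec[self.key] = value  # in-place write, invisible to autograd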

Hope this helps.

Upvotes: 1

l4rmbr

Reputation: 1307

An in-place operation means the assignment you've done modifies the underlying storage of your Tensor, which, according to your error message, has requires_grad set to True.

That said, your param.pdfvec[self.key] is not itself a leaf Tensor: indexing produces a view of the leaf tensor param.pdfvec. Assigning a value to that view writes into the leaf in place, which would interfere with autograd's bookkeeping, so the action is prohibited by default. You can still do it by directly modifying the underlying storage (e.g., with .data).
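
For illustration, a minimal sketch (the tensor x is a stand-in for param.pdfvec):

import torch

x = torch.zeros(3, requires_grad=True)   # leaf tensor with requires_grad=True

# x[0] = 1.0     # RuntimeError: in-place write into a view of a leaf
x.data[0] = 1.0  # writes straight to the storage; autograd never sees it

Note that .data bypasses autograd completely, so nothing about the assignment is recorded for the backward pass; wrapping the plain assignment in torch.no_grad(), as in the other answer, has the same effect and is generally considered the safer form these days.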

Upvotes: 12
