Lucas Kim

Reputation: 133

backward and grad functions in PyTorch

I'm trying to understand the backward and grad functions in PyTorch.

But I don't know why the value below is returned.

Here is my code:

import torch
from torch.autograd import Variable

x = Variable(torch.FloatTensor([[1, 2], [3, 4]]), requires_grad=True)
y = x + 2
z = y * y

# z is not a scalar, so backward() needs an explicit gradient argument
gradient = torch.ones(2, 2)
z.backward(gradient)
print(x.grad)

I think the result should be [[6, 8], [10, 12]],

because dz/dx = 2*(x+2), and with x = 1, 2, 3, 4 this gives 6, 8, 10, 12.
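As a quick sanity check (plain tensor arithmetic only, no autograd involved), evaluating 2*(x+2) element-wise gives exactly those values:

import torch

# evaluate dz/dx = 2 * (x + 2) by hand, element-wise
x_vals = torch.FloatTensor([[1, 2], [3, 4]])
print(2 * (x_vals + 2))   # [[6, 8], [10, 12]]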

But the returned value is [[7, 9], [11, 13]].

Why does this happen? I want to know what the gradient argument and the grad function are doing.

Please help me.

Upvotes: 4

Views: 2525

Answers (1)

Shubham Jain

Reputation: 40

The following piece of code, run on PyTorch v0.12.1,

import torch
from torch.autograd import Variable
x = Variable(torch.FloatTensor([[1,2],[3,4]]), requires_grad=True)
y = x + 2
z = y * y
gradient = torch.ones(2, 2)
z.backward(gradient)
print(x.grad)

returns

Variable containing:
  6   8
 10  12
[torch.FloatTensor of size 2x2]

Update your PyTorch installation. The PyTorch autograd documentation explains how autograd, which handles gradient computation in PyTorch, works.
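In newer PyTorch releases (roughly 0.4 onwards), Variable has been merged into Tensor, so the same computation can be written with plain tensors. The snippet below is a sketch of that, plus torch.autograd.grad as an alternative way to obtain the gradient directly; exact output formatting may differ by version.

import torch

# same computation without Variable (merged into Tensor in PyTorch >= 0.4)
x = torch.tensor([[1., 2.], [3., 4.]], requires_grad=True)
y = x + 2
z = y * y

# z is not a scalar, so backward() needs an explicit gradient argument
z.backward(torch.ones(2, 2))
print(x.grad)   # tensor([[ 6.,  8.], [10., 12.]])

# alternative: torch.autograd.grad returns the gradient without touching x.grad
x2 = torch.tensor([[1., 2.], [3., 4.]], requires_grad=True)
z2 = (x2 + 2) * (x2 + 2)
grad_x, = torch.autograd.grad(z2, x2, grad_outputs=torch.ones(2, 2))
print(grad_x)   # tensor([[ 6.,  8.], [10., 12.]])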

Upvotes: 2
