Reputation: 11
I have a tensor A of shape (1, 768) that requires grad, and a tensor B of shape (2, 4, 768). I want to replace some values of B with A and have gradients flow back through A normally. However, direct assignment like B[batch][replace_ids].data = A seems to lose all gradients in A, while B[batch][replace_ids] = A raises: RuntimeError: a view of a leaf Variable that requires grad is being used in an in-place operation.
Is there any feasible way to do this?
Thanks in advance.
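A minimal sketch of the failure, with the shapes from above and hypothetical values for batch and replace_ids (the real indices are not shown in the question):

```python
import torch

# Assumed stand-ins for the question's tensors and indices
A = torch.randn(1, 768, requires_grad=True)
B = torch.randn(2, 4, 768, requires_grad=True)
batch, replace_ids = 0, [1]

try:
    # B[batch] is a view of the leaf B, so writing into it in-place fails
    B[batch][replace_ids] = A
except RuntimeError as e:
    print(e)
```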
Upvotes: 1
Views: 386
Reputation: 4826
It would be great if we could see a MWE, but I guess you can try
B = B.clone()
B[batch, replace_ids] = A
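Expanding the suggestion into a runnable sketch (the shapes and the batch/replace_ids values below are assumed stand-ins, since the question does not show them):

```python
import torch

# Assumed stand-ins for the question's setup
A = torch.randn(1, 768, requires_grad=True)
B = torch.randn(2, 4, 768, requires_grad=True)
batch, replace_ids = 0, [1]

B2 = B.clone()              # clone() is differentiable, and B2 is not a leaf
B2[batch, replace_ids] = A  # in-place write on a non-leaf is allowed
B2.sum().backward()

# Gradients reach both A and B
assert A.grad is not None
assert B.grad is not None
```

Because clone() is recorded by autograd, gradients flow back to B for the positions that were not overwritten, and to A for the positions that were.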
Upvotes: 1