MBT

Reputation: 24169

PyTorch - modifications of autograd variables

In my PyTorch program, I have a matrix that is updated continuously at runtime.

I wonder how to perform this update. I tried using something like this:

matrix[0, index] = hidden[0]

Both matrix and hidden are autograd Variables. With the line above, I get the following error message:

RuntimeError: one of the variables needed for gradient computation has been modified by an inplace operation

I wonder how to get around this and perform the update without using in-place operations.

Slicing the matrix and building a new one with torch.cat would probably work, but that doesn't seem like a very nice solution.
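For reference, a minimal sketch of that torch.cat workaround, written against the modern PyTorch API (where Variable has since been merged into Tensor); the helper name set_entry and the shapes are made up for illustration:

```python
import torch

def set_entry(matrix, index, value):
    """Return a copy of `matrix` with matrix[0, index] replaced by `value`.

    The new matrix is built out-of-place from slices, so no tensor that
    autograd needs is modified and gradients flow through `value`.
    """
    row = matrix[0]
    new_row = torch.cat([row[:index], value.reshape(1), row[index + 1:]])
    return new_row.unsqueeze(0)

matrix = torch.zeros(1, 5, requires_grad=True)
hidden = torch.ones(1, requires_grad=True)

updated = set_entry(matrix, 2, hidden[0])
updated.sum().backward()
# hidden receives a gradient; the overwritten entry of matrix does not.
```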

Is there a better way of doing this?

Thanks in advance!

Upvotes: 0

Views: 432

Answers (1)

Christian

Reputation: 1257

Posting a piece of code might help, but have you tried using a dataset? With one, you can run through the data sequentially and efficiently.

http://pytorch.org/docs/master/data.html#torch.utils.data.TensorDataset
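A minimal sketch of how the linked TensorDataset is typically used together with a DataLoader; the tensor shapes and batch size here are invented for illustration:

```python
import torch
from torch.utils.data import TensorDataset, DataLoader

# Pair inputs and targets so they are indexed together.
inputs = torch.randn(8, 3)
targets = torch.randn(8, 1)
dataset = TensorDataset(inputs, targets)

# DataLoader iterates over the dataset in batches.
loader = DataLoader(dataset, batch_size=4)
for x, y in loader:
    print(x.shape, y.shape)  # torch.Size([4, 3]) torch.Size([4, 1])
```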

Upvotes: 1
