user11173832

Reputation:

volatile was removed and now has no effect; use with torch.no_grad() instead

My torch program stopped at this point, so I guess I can no longer use volatile=True.
Why did it stop, and how should I change this code?

images = Variable(images.cuda())
targets = [Variable(ann.cuda(), volatile=True) for ann in targets]

train.py:166: UserWarning: volatile was removed and now has no effect. Use with torch.no_grad(): instead.

Upvotes: 2

Views: 5608

Answers (1)

jodag

Reputation: 22214

Variable doesn't do anything and has been deprecated since PyTorch 0.4.0; its functionality was merged into the torch.Tensor class. Back then, the volatile flag disabled construction of the computation graph for any operation the volatile variable was involved in. Newer PyTorch versions instead use with torch.no_grad(): to disable graph construction for everything in the body of the with statement.
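To see the difference in isolation, here is a small standalone sketch (plain tensors, no model needed) showing how torch.no_grad() suppresses graph construction:

```python
import torch

# A tensor that autograd would normally track
x = torch.ones(3, requires_grad=True)

# Outside no_grad: the operation is recorded in the computation graph
y = x * 2
print(y.requires_grad)  # True

# Inside no_grad: graph construction is disabled for the whole block
with torch.no_grad():
    z = x * 2
print(z.requires_grad)  # False
```

This is exactly what volatile=True used to do per-variable, now expressed per-block.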

What you should change depends on why you were using volatile in the first place. In any case, you probably want

images = images.cuda()
targets = [ann.cuda() for ann in targets]

During training you would use something like the following so that the computation graph is created (assuming the standard variable names model, criterion, and optimizer).

output = model(images)
loss = criterion(output, targets)
optimizer.zero_grad()
loss.backward()
optimizer.step()

Since you don't need to perform backpropagation during evaluation, you would use with torch.no_grad(): to disable creation of the computation graph, which reduces the memory footprint and speeds up computation.

with torch.no_grad():
    output = model(images)
    loss = criterion(output, targets)
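Note that torch.no_grad() only disables gradient tracking; layers like dropout and batch norm also need model.eval() to switch to inference behavior. A minimal runnable sketch (the nn.Linear model, shapes, and random inputs here are stand-ins for illustration):

```python
import torch
import torch.nn as nn

model = nn.Linear(4, 2)   # stand-in model for illustration
criterion = nn.MSELoss()

model.eval()              # put dropout/batchnorm layers in inference mode
with torch.no_grad():     # no computation graph is built in this block
    images = torch.randn(8, 4)
    targets = torch.randn(8, 2)
    output = model(images)
    loss = criterion(output, targets)

print(loss.requires_grad)  # False: loss is detached from autograd
```

Because no graph was built, calling loss.backward() here would raise an error, which is fine since evaluation never needs gradients.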

Upvotes: 5
