Zubayr

Reputation: 456

LBFGS Giving Tensor Object not Callable Error when using Optimizer.step

I am trying to use the SGD, Adam, and LBFGS optimizers.

The relevant part of the code is:

for batch_idx, (inputs, targets) in enumerate(trainloader):
    batch_size = inputs.size(0)
    total += batch_size
    one_hot_targets = torch.FloatTensor(batch_size, 10).zero_()
    one_hot_targets = one_hot_targets.scatter_(1, targets.view(batch_size, 1), 1.0)
    one_hot_targets = one_hot_targets.float()
    if use_cuda:
        inputs, one_hot_targets = inputs.cuda(), one_hot_targets.cuda()
    inputs, one_hot_targets = Variable(inputs), Variable(one_hot_targets)

    if optimizer_val == 'sgd' or optimizer_val == 'adam':
        outputs = F.softmax(net(inputs))
        loss = criterion(outputs, one_hot_targets)

        loss.backward()
        optimizer.step()

    else:
        def closure():
            optimizer.zero_grad()
            outputs = F.softmax(net(inputs))
            loss = criterion(outputs, one_hot_targets)
            loss.backward()
            return loss

        optimizer.step(closure())

In the optimizer.step(closure()) call for LBFGS (the else branch) I am getting this error:

TypeError: 'Tensor' object is not callable

I checked, and the loss is a tensor.

How can I make it work?

Upvotes: 0

Views: 528

Answers (1)

Ivan

Reputation: 40668

You need to pass a function callback to optimizer.step; don't call it yourself:

optimizer.step(closure)
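
For reference, a minimal sketch of the corrected else branch, assuming the same net, criterion, optimizer, inputs, and one_hot_targets from the question. LBFGS may re-evaluate the closure several times per step, which is why the forward and backward passes have to live inside it:

else:
    def closure():
        # LBFGS can call this several times per step,
        # so zero the gradients on every evaluation.
        optimizer.zero_grad()
        outputs = F.softmax(net(inputs), dim=1)  # explicit dim avoids the implicit-dim warning
        loss = criterion(outputs, one_hot_targets)
        loss.backward()
        return loss

    # Pass the function object itself; LBFGS calls it internally
    # and returns the final loss value.
    loss = optimizer.step(closure)

Passing closure() instead evaluates the closure once and hands the resulting loss tensor to step, which then tries to call that tensor internally, producing the "'Tensor' object is not callable" error.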

Upvotes: 2
