user3550366

Reputation: 125

Custom Loss in Pytorch where object does not have attribute backward()

I am new to pytorch and I tried creating my own custom loss. This has been really challenging. Below is what I have for my loss.

import torch
import torch.nn as nn

class CustomLoss(nn.Module):

    def __init__(self,  size_average=True, reduce=True):
        """

        Args:
            size_average (bool, optional): By default, the losses are averaged
               over observations for each minibatch. However, if the field
               size_average is set to ``False``, the losses are instead summed for
               each minibatch. Only applies when reduce is ``True``. Default: ``True``
            reduce (bool, optional): By default, the losses are averaged
               over observations for each minibatch, or summed, depending on
               size_average. When reduce is ``False``, returns a loss per input/target
               element instead and ignores size_average. Default: ``True``
        """
        super(CustomLoss, self).__init__()      


    def forward(self, S, N, M, type='softmax'):

        return self.loss_cal(S, N, M, type)



    ### new loss cal
    def loss_cal(self, S, N, M, type="softmax",):
        """ calculate loss with similarity matrix(S) eq.(6) (7)
        :type: "softmax" or "contrast"
        :return: loss
        """

        self.A = torch.cat([S[i * M:(i + 1) * M, i:(i + 1)]
                               for i in range(N)], dim=0)        
        self.A = torch.autograd.Variable(self.A)        


        if type == "softmax":
            self.B = torch.log(torch.sum(torch.exp(S.float()), dim=1, keepdim=True) + 1e-8)
            self.B = torch.autograd.Variable(self.B)       
            total = torch.abs(torch.sum(self.A - self.B))        
        else:
            raise AssertionError("loss type should be softmax or contrast !")
        return total

When I run the following:

loss = CustomLoss()          
(loss.loss_cal(S=S,N=N,M=M))
loss.backward()

I get the following error:

C:\Program Files\Anaconda3\lib\site-packages\IPython\core\interactiveshell.py in run_cell_magic(self, magic_name, line, cell)
   2113             magic_arg_s = self.var_expand(line, stack_depth)
   2114             with self.builtin_trap:
-> 2115                 result = fn(magic_arg_s, cell)
   2116             return result
   2117 

<decorator-gen-60> in time(self, line, cell, local_ns)

C:\Program Files\Anaconda3\lib\site-packages\IPython\core\magic.py in <lambda>(f, *a, **k)
    186     # but it's overkill for just that one bit of state.
    187     def magic_deco(arg):
--> 188         call = lambda f, *a, **k: f(*a, **k)
    189 
    190         if callable(arg):

C:\Program Files\Anaconda3\lib\site-packages\IPython\core\magics\execution.py in time(self, line, cell, local_ns)
   1178         else:
   1179             st = clock2()
-> 1180             exec(code, glob, local_ns)
   1181             end = clock2()
   1182             out = None

<timed exec> in <module>()

C:\Program Files\Anaconda3\lib\site-packages\torch\nn\modules\module.py in __getattr__(self, name)
    530                 return modules[name]
    531         raise AttributeError("'{}' object has no attribute '{}'".format(
--> 532             type(self).__name__, name))
    533 
    534     def __setattr__(self, name, value):

AttributeError: 'CustomLoss' object has no attribute 'backward'

Why am I getting this error? I did not face this error in TensorFlow. My understanding is that it has something to do with autograd. If someone can explain why I am facing this error, I can figure out the rest.

Upvotes: 1

Views: 4409

Answers (1)

cleros

Reputation: 4343

Hi!

The problem is that you are calling backward on the module itself, not on the loss value it returns (which is what you presumably want). Since you have not implemented a backward method on the module, the attribute lookup fails and raises the AttributeError you see. What you want to do instead is:

loss_func = CustomLoss()
loss = loss_func.loss_cal(S=S, N=N, M=M)
loss.backward()
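
Since CustomLoss subclasses nn.Module, you can also call the instance directly; nn.Module.__call__ dispatches to your forward, which just forwards to loss_cal:

loss = loss_func(S, N, M)  # __call__ runs forward(), which calls loss_cal()
loss.backward()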

As a general remark: you are using an nn.Module that has no parameters. While that works, it is not what nn.Module is meant for, and it should therefore be avoided. Instead, simply write a plain function; after all, the function you have there is effectively static anyway. If you really want to go for a class, think about the kind of class you want to create: a loss. Losses, however, can have special PyTorch properties, so you should read up on the discussion here.
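
For illustration, here is a minimal sketch of the pure-function version, keeping your softmax branch unchanged. The parameter is renamed from type to loss_type to avoid shadowing the built-in, and the Variable wrappers are dropped, since in PyTorch 0.4+ plain tensors track gradients on their own:

import torch

def custom_loss(S, N, M, loss_type="softmax"):
    """Same computation as CustomLoss.loss_cal, as a plain function."""
    # stack the M entries of column i for each of the N groups
    A = torch.cat([S[i * M:(i + 1) * M, i:(i + 1)] for i in range(N)], dim=0)
    if loss_type == "softmax":
        # row-wise log-sum-exp of S, with a small epsilon for numerical safety
        B = torch.log(torch.sum(torch.exp(S.float()), dim=1, keepdim=True) + 1e-8)
        return torch.abs(torch.sum(A - B))
    raise AssertionError("loss type should be softmax or contrast !")

loss = custom_loss(S, N, M)
loss.backward()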

Upvotes: 1
