user3768533

Reputation: 1357

Torch DoubleTensor CharTensor Incompatibility

Essentially, how does one multiply a CharTensor mask with a DoubleTensor? I am making a module for Torch that is supposed to be similar to the ReLU module. Therefore, self.mask is supposed to be a mask of 0s and 1s that I want to apply elementwise to the DoubleTensor self.output. Currently self.mask is a CharTensor and I am not able to multiply the two, as seen in the log below. It seems so trivial, but I have been searching online forever and cannot find a solution.

Infinite thanks,

-an anxious coder

function ReQU:updateOutput(input)
  ...
  self.mask = torch.gt(input, 0)
  self.output:cmul(self.mask)
  ...

Log:

invalid arguments: DoubleTensor ByteTensor
expected arguments: DoubleTensor [DoubleTensor] DoubleTensor
stack traceback:
  [C]: in function 'cmul'
  ./requ.lua:21: in function 'forward'

Upvotes: 0

Views: 202

Answers (1)

mbrenon

Reputation: 4941

Torch exposes methods that do this without requiring you to multiply the values yourself. The simplest is probably to invert your mask and use the [] index operator to zero out the masked entries:

self.mask = torch.le(input, 0)
self.output[self.mask] = 0
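
If you would rather keep the elementwise multiply from your original code, a possible alternative (a sketch, not tested against your module) is to cast the comparison mask to the output's type before calling cmul, so that both operands are DoubleTensors:

-- keep the positive mask, but convert it to a DoubleTensor before multiplying
self.mask = torch.gt(input, 0)                    -- ByteTensor of 0s and 1s
self.output:cmul(self.mask:typeAs(self.output))   -- cast mask to match self.output, then multiply

The indexed assignment above avoids the extra tensor allocation from the cast, which is why it is usually the simpler choice.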

Upvotes: 2
