manuel

Reputation: 563

In torch / nn, how to use nn.View(-1):setNumInputDims(2) with a minibatch 4D tensor?

I am writing a neural network with torch nn.

As part of it I have to transform a 3D tensor with dimensions a x b x c into a 2D tensor with dimensions a x b*c.

Here is the code:

require 'nn'

input = torch.Tensor(a, b, c)  -- arbitrary 3D tensor; a, b, c are not known in advance
net = nn.Sequential()
net:add(nn.View(-1):setNumInputDims(2))
net:forward(input)             -- output has size a x (b*c), as desired

Now I want to modify my network so that it can handle minibatches as input. Therefore, I want to transform a 4D tensor with dimensions d x a x b x c into a 3D tensor with dimensions d x a x b*c where d is the number of elements in my minibatch. d is known in advance, but the other dimensions aren't.

When I feed a 4D tensor into the network above, I get a 2D tensor of size d*a x b*c out of it. How do I have to modify the network so that it produces the desired 3D tensor instead?
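For concreteness, here is a minimal reproduction of the behaviour I am describing (the sizes d=2, a=3, b=4, c=5 are purely illustrative):

require 'nn'

local d, a, b, c = 2, 3, 4, 5   -- illustrative minibatch size and input dimensions
local net = nn.Sequential()
net:add(nn.View(-1):setNumInputDims(2))

local output = net:forward(torch.Tensor(d, a, b, c))
print(output:size())  -- 6 x 20, i.e. (d*a) x (b*c); I want 2 x 3 x 20, i.e. d x a x (b*c)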

I have tried different combinations like nn.View(-1):setNumInputDims(3), nn.View(d, -1):setNumInputDims(2) and nn.View(d, -1):setNumInputDims(3), but none of them produced a tensor of the shape I want.

Upvotes: 0

Views: 2995

Answers (2)

roger

Reputation: 1

My idea is to transpose the 2nd dim into the 1st dim; nn.Reshape has an optional boolean argument (batchMode) that takes care of the unknown dimension. Hacky, but it seems to work:
nn.Transpose({1, 2})
nn.Reshape(d, -1, true)
nn.Transpose({1, 2})
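Here is a sketch of how these three modules could be chained in an nn.Sequential; the sizes are illustrative, and the flattened size is written out explicitly as b*c instead of -1:

require 'nn'

local d, a, b, c = 2, 3, 4, 5          -- illustrative sizes
local net = nn.Sequential()
net:add(nn.Transpose({1, 2}))          -- d x a x b x c  ->  a x d x b x c
net:add(nn.Reshape(d, b*c, true))      -- batchMode = true: a x d x b x c  ->  a x d x (b*c)
net:add(nn.Transpose({1, 2}))          -- a x d x (b*c)  ->  d x a x (b*c)

local output = net:forward(torch.Tensor(d, a, b, c))
print(output:size())                   -- 2 x 3 x 20, i.e. d x a x (b*c)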

Upvotes: 0

manuel

Reputation: 563

nn seems to be very limited in this case. The best solution I could find is to build a parallel-like network with nn.Concat and nn.Select:

local net = nn.Concat(1)
for i=1, d do
  local subNet = nn.Sequential()
  subNet:add(nn.Select(1, i))                 -- pick the i-th minibatch element: a x b x c
  subNet:add(nn.View(-1):setNumInputDims(2))  -- flatten the last two dims: a x (b*c)
  subNet:add(nn.Replicate(1, 1))              -- re-add a leading dim of size 1: 1 x a x (b*c)
  net:add(subNet)
end

Note: You really need nn.Replicate here; without it, the result would again be a d*a x b*c tensor.
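For concreteness, a quick shape check (assuming d = 2 and the net built as above; the sizes are illustrative):

-- illustrative sizes: d = 2, a = 3, b = 4, c = 5
local output = net:forward(torch.Tensor(2, 3, 4, 5))
print(output:size())  -- 2 x 3 x 20, i.e. d x a x (b*c)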

P.S. If someone can provide a better solution (preferably one that does not split the minibatch), I will accept that answer instead.

Upvotes: 1
