Abhi Minhas

Reputation: 37

Dimension out of range (expected to be in range of [-1, 0], but got 1) (pytorch)

I have a very simple feed-forward neural network (PyTorch):

import torch
import torch.nn as nn
import torch.nn.functional as F
import numpy as np

class Net_1(nn.Module):
    def __init__(self):
        super(Net_1, self).__init__()
        self.fc1 = nn.Linear(5*5, 64)   # flattened 5x5 input -> 64
        self.fc2 = nn.Linear(64, 32)
        self.fc3 = nn.Linear(32, 3)     # 3 output classes

    def forward(self, x):
        x = F.relu(self.fc1(x))
        x = F.relu(self.fc2(x))
        x = self.fc3(x)
        return F.log_softmax(x, dim=1)

net = Net_1()

and the input is this 5x5 array (a nested list converted to a tensor):

state = [[0, 0, 3, 0, 0],
         [0, 0, 0, 0, 0],
         [0, 2, 1, 0, 0],
         [0, 0, 0, 0, 0],
         [0, 0, 0, 0, 0]]
state = torch.Tensor(state).view(-1)

Calling net(state) then throws the following error:

Dimension out of range (expected to be in range of [-1, 0], but got 1)

The error is raised when F.log_softmax() is applied.

Upvotes: 0

Views: 6444

Answers (2)

Megan Hardy

Reputation: 397

At the point where you call return F.log_softmax(x, dim=1), x is a 1-dimensional tensor with shape torch.Size([3]).

Dimension indexing in PyTorch starts at 0, so you cannot use dim=1 on a 1-dimensional tensor; you will need to use dim=0.

Replace return F.log_softmax(x, dim=1) with return F.log_softmax(x, dim=0) and you'll be good to go.

In the future you can check tensor sizes by adding print(x.shape) in forward.
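For example (a quick sketch; the random tensor below just stands in for the output of fc3):

import torch
import torch.nn.functional as F

x = torch.randn(3)               # 1-D tensor, like the output of fc3 for a single unbatched input
print(x.shape)                   # torch.Size([3]) -- only dim=0 exists
out = F.log_softmax(x, dim=0)    # works; dim=1 raises the error above
print(out.exp().sum())           # tensor(1.) -- the log-probabilities exponentiate to sum to 1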

Upvotes: 1

Gunner Stone

Reputation: 1005

You are giving a 3-element 1-D tensor to your log_softmax function.

By passing dim=1 you are telling it to apply softmax along an axis that doesn't exist.

Just set dim=0 for a 1-D tensor.

More on this function and what the dim parameter means can be found in the PyTorch documentation for torch.nn.functional.log_softmax.
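For example (a quick sketch of what dim selects; the random tensors are just placeholders):

import torch
import torch.nn.functional as F

t = torch.randn(2, 3)
print(F.log_softmax(t, dim=0).exp().sum(dim=0))   # tensor([1., 1., 1.]) -- each column normalized
print(F.log_softmax(t, dim=1).exp().sum(dim=1))   # tensor([1., 1.]) -- each row normalized
print(F.log_softmax(torch.randn(3), dim=0))       # on a 1-D tensor, dim=0 is the only valid axis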

Upvotes: 0
