Shadow Walker

Reputation: 1199

PyTorch RuntimeError: Tensor for argument #1 'self' is on CPU, but expected them to be on GPU

I'm using PyTorch for my logistic regression model, but whenever I run the model summary I get an error:

RuntimeError: Tensor for 'out' is on CPU, Tensor for argument #1 'self' is on CPU, but expected them to be on GPU (while checking arguments for addmm)

Code

# Imports assumed by the snippet below
import torch
import torch.nn as nn
import torch.nn.functional as F
from torchsummary import summary

# Convert data to tensors
X_train = torch.Tensor(X_train)
y_train = torch.LongTensor(y_train)
X_test = torch.Tensor(X_test)
y_test = torch.LongTensor(y_test)

class LogisticRegression(nn.Module):
    def __init__(self, input_features, num_classes):
        super(LogisticRegression, self).__init__()
        self.fc1 = nn.Linear(input_dim, num_classes)
        
    def forward(self, x_in, apply_softmax = False):
        y_pred = self.fc1(x_in)
        if apply_softmax:
            y_pred = F.softmax(y_pred, dim = 1)
        return y_pred

INPUT_DIM = X_train.shape[1]
NUM_CLASSES = len(y_train.unique())

model = LogisticRegression(input_features = INPUT_DIM, num_classes = NUM_CLASSES)
print(model.named_parameters)
summary(model, input_size=(INPUT_DIM,))

This does not work as expected. How do I go about fixing the problem?

Upvotes: 2

Views: 6617

Answers (2)

cbare

Reputation: 12468

I had the same error.

RuntimeError: Tensor for 'out' is on CPU, Tensor for argument #1 'self' is on CPU, but expected them to be on GPU (while checking arguments for addmm)

Ensuring the model and its weights were on the GPU helped:

model.to(device)

where device is defined:

device = "cuda" if torch.cuda.is_available() else "cpu"
print(f"Using {device} device")

Upvotes: 2

Ivan

Reputation: 40678

If you're using a notebook, you have most likely mixed up your variables: input_dim is not defined in your snippet, but it may have been defined earlier in your scope.

Assuming you are using summary from torchsummary, you don't need data to infer the model's structure, only the input shape. The following will work:

import torch.nn as nn
import torch.nn.functional as F
from torchsummary import summary

class LogisticRegression(nn.Module):
    def __init__(self, input_features, num_classes):
        super(LogisticRegression, self).__init__()
        self.fc1 = nn.Linear(input_features, num_classes) # <- was input_dim
        
    def forward(self, x_in, apply_softmax = False):
        y_pred = self.fc1(x_in)
        if apply_softmax:
            y_pred = F.softmax(y_pred, dim = 1)
        return y_pred

INPUT_DIM = 10
NUM_CLASSES = 100

model = LogisticRegression(input_features=INPUT_DIM, num_classes=NUM_CLASSES)
summary(model, input_size=(INPUT_DIM,))
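
As a side note, the mismatch in the original traceback can come from summary itself: the torchsummary package builds its dummy input on CUDA by default when a GPU is available, so a CPU-resident model triggers exactly this addmm error. If your installed version accepts a device argument (recent releases of sksq96's pytorch-summary do; check your version), keeping it in sync with the model sidesteps the problem. A hedged sketch:

import torch

device = "cuda" if torch.cuda.is_available() else "cpu"
model = model.to(device)

# Keep summary()'s dummy input on the same device as the model's
# weights to avoid the CPU/GPU mismatch from the question.
summary(model, input_size=(INPUT_DIM,), device=device)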

Upvotes: 0
