Jibin Mathew

Reputation: 5102

Implement dropout in a fully connected layer in PyTorch

How do I apply dropout to the following fully connected network in PyTorch:

import torch.nn as nn
import torch.nn.functional as F

class NetworkRelu(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc1 = nn.Linear(784, 128)
        self.fc2 = nn.Linear(128, 64)
        self.fc3 = nn.Linear(64, 10)

    def forward(self, x):
        x = F.relu(self.fc1(x))
        x = F.relu(self.fc2(x))
        x = F.softmax(self.fc3(x), dim=1)
        return x

Upvotes: 6

Views: 9514

Answers (1)

Jibin Mathew

Reputation: 5102

Since the forward method already uses functional calls, you could use functional dropout (F.dropout). However, it is better to define an nn.Dropout module in __init__(), because dropout is then turned off automatically when the model is put into evaluation mode with model.eval().
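
For completeness, a minimal sketch of the functional variant (the rate p=0.5 is just an illustrative choice): with F.dropout you have to pass self.training yourself, otherwise dropout stays active even during evaluation.

    import torch.nn as nn
    import torch.nn.functional as F

    class NetworkRelu(nn.Module):
        def __init__(self):
            super().__init__()
            self.fc1 = nn.Linear(784, 128)
            self.fc2 = nn.Linear(128, 64)
            self.fc3 = nn.Linear(64, 10)

        def forward(self, x):
            # F.dropout cannot see the module's mode, so self.training
            # must be forwarded manually on every call.
            x = F.dropout(F.relu(self.fc1(x)), p=0.5, training=self.training)
            x = F.dropout(F.relu(self.fc2(x)), p=0.5, training=self.training)
            x = F.softmax(self.fc3(x), dim=1)
            return x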

Here is the preferred version using nn.Dropout:

import torch.nn as nn
import torch.nn.functional as F

class NetworkRelu(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc1 = nn.Linear(784, 128)
        self.fc2 = nn.Linear(128, 64)
        self.fc3 = nn.Linear(64, 10)
        self.dropout = nn.Dropout(p=0.5)  # zeroes each element with probability 0.5

    def forward(self, x):
        # Apply dropout after each hidden activation; it is disabled
        # automatically when the model is in eval mode.
        x = self.dropout(F.relu(self.fc1(x)))
        x = self.dropout(F.relu(self.fc2(x)))
        x = F.softmax(self.fc3(x), dim=1)
        return x
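
A quick way to see the train/eval behaviour (the batch here is just a dummy input for illustration):

    import torch

    model = NetworkRelu()
    x = torch.randn(2, 784)  # dummy batch of two flattened 28x28 images

    model.train()   # dropout active: repeated forward passes give different outputs
    print(model(x))

    model.eval()    # dropout disabled: the output is deterministic
    print(model(x))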

Upvotes: 13
