Oriol Moreno

Reputation: 13

Can't init the weights of my neural network PyTorch

I can't initialize the weights with the function MyNet.apply(init_weights).

These are my functions:

def init_weights(net):
    if type(net) == torch.nn.Module:
        torch.nn.init.kaiming_uniform_(net.weight)
        net.bias.data.fill_(0.01)  # set all biases to 0.01

My neural net is the following:

class NeuralNet(torch.nn.Module):
    def __init__(self):
        super().__init__() # Necessary for torch to detect this class as trainable
        # Here define network architecture
        self.layer1 = torch.nn.Linear(28**2, 32).to(device) # Linear layer with 32 neurons
        self.layer2 = torch.nn.Linear(32, 64).to(device) # Linear layer with 64 neurons
        self.layer3 = torch.nn.Linear(64, 128).to(device)  # Linear layer with 128 neurons
        self.output = torch.nn.Linear(128, 1).to(device) # Linear layer with 1 output neuron (binary output)




    def forward(self, x):
        # Here define architecture behavior
        x = torch.sigmoid(self.layer1(x)).to(device) # x = torch.nn.functional.relu(self.layer1(x))
        x = torch.sigmoid(self.layer2(x)).to(device)  
        x = torch.sigmoid(self.layer3(x)).to(device)

        return torch.sigmoid(self.output(x)).to(device) # Binary output


type(net) prints as Linear, so execution never enters the if statement, and if I remove the check it produces the following error:

AttributeError: 'NeuralNet' object has no attribute 'weight'

Upvotes: 1

Views: 3423

Answers (1)

Shai

Reputation: 114796

You should init only the weights of the linear layers:

def init_weights(net):
    if type(net) == torch.nn.Linear:
        torch.nn.init.kaiming_uniform_(net.weight)
        net.bias.data.fill_(0.01)  # set all biases to 0.01
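
For reference, Module.apply calls the function on every submodule and on the module itself, so the type check is what skips the containing NeuralNet (which has no .weight attribute). A minimal sketch, trimmed to two layers and run on CPU rather than your device:

```python
import torch

def init_weights(net):
    # apply() also visits the top-level NeuralNet, which has no
    # .weight/.bias, so only act on Linear layers.
    if isinstance(net, torch.nn.Linear):
        torch.nn.init.kaiming_uniform_(net.weight)
        net.bias.data.fill_(0.01)  # set all biases to 0.01

class NeuralNet(torch.nn.Module):
    def __init__(self):
        super().__init__()
        self.layer1 = torch.nn.Linear(28**2, 32)
        self.output = torch.nn.Linear(32, 1)

    def forward(self, x):
        x = torch.sigmoid(self.layer1(x))
        return torch.sigmoid(self.output(x))

net = NeuralNet()
net.apply(init_weights)  # recursively visits every submodule
print(net.layer1.bias[0].item())  # close to 0.01 (float32 rounding)
```

isinstance is used here instead of type(net) == torch.nn.Linear so the check also covers subclasses of Linear; either works for your network.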

Upvotes: 4
