Dadeslam

Reputation: 201

Define nn.Parameters with a for loop

I am interested in defining L weights in a custom neural network with Pytorch.

If L is known, it is not a problem to define them one by one, but when L is not known in advance I want to define them with a for loop. My idea was to do something like this (which does not work):

class Network(nn.Module):
    def __init__(self, ):
        super(Network, self).__init__()
        self.nl = nn.ReLU()

        for i in range(L):
          namew = 'weight' + str(i)
          # bug: this assigns the same attribute 'self.namew' on every iteration
          self.namew = torch.nn.Parameter(data=torch.Tensor(2, 2), requires_grad=True)

The loop should be equivalent to this (which works, but is limited to a fixed number of weights):

class Network(nn.Module):
    def __init__(self, ):
        super(Network, self).__init__()
        self.nl = nn.ReLU()

        self.weight1 = torch.nn.Parameter(data=torch.Tensor(2,2), requires_grad=True)
        self.weight2 = torch.nn.Parameter(data=torch.Tensor(2,2), requires_grad=True)
        self.weight3 = torch.nn.Parameter(data=torch.Tensor(2,2), requires_grad=True)

The problem with what I tried is that instead of using the "dynamic" string stored in namew as the attribute name, Python assigns to the literal attribute namew. Therefore, instead of L weights, just one weight is defined.

Is there some way to solve this problem?

Upvotes: 0

Views: 880

Answers (2)

DerekG

Reputation: 3958

You can accomplish this with a ParameterDict for raw parameters, or a ModuleDict for nn.Module layers:

class Network(nn.Module):
    def __init__(self):
        super(Network, self).__init__()
        self.nl = nn.ReLU()

        # define some nn.Module layers
        self.layers = nn.ModuleDict()
        for i in range(L):
            self.layers["layer{}".format(i)] = torch.nn.Linear(2, 2)

        # define some non-module parameters
        self.weights = torch.nn.ParameterDict()
        for i in range(L):
            self.weights["weights{}".format(i)] = torch.nn.Parameter(data=torch.Tensor(2, 2), requires_grad=True)
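As a quick check (a minimal sketch, assuming L = 3 and 2x2 linear layers chosen for illustration), everything registered through the ModuleDict and ParameterDict shows up in model.parameters(), so it will be trained by an optimizer:

```python
import torch
import torch.nn as nn

L = 3  # assumed number of layers/weights for this sketch

class Network(nn.Module):
    def __init__(self):
        super(Network, self).__init__()
        self.nl = nn.ReLU()

        # module layers, keyed by a dynamically built name
        self.layers = nn.ModuleDict()
        for i in range(L):
            self.layers["layer{}".format(i)] = nn.Linear(2, 2)

        # raw parameters, also keyed by a dynamically built name
        self.weights = nn.ParameterDict()
        for i in range(L):
            self.weights["weights{}".format(i)] = nn.Parameter(torch.randn(2, 2))

net = Network()
# each Linear contributes a weight and a bias; each ParameterDict entry is one tensor
n_params = sum(1 for _ in net.parameters())
print(n_params)  # 3*2 + 3 = 9
```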

Upvotes: 1

Dadeslam

Reputation: 201

I solved the problem with this line of code, in place of the for loop:

self.weights = nn.ParameterList([nn.Parameter(torch.randn(2, 2)) for i in range(L)])
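For completeness, a minimal sketch of the ParameterList in use (assuming L = 3 and an illustrative forward pass that chains the 2x2 weights; both are assumptions, not part of the original answer). Every entry in the list is registered as a trainable parameter:

```python
import torch
import torch.nn as nn

L = 3  # assumed value for this sketch

class Network(nn.Module):
    def __init__(self):
        super(Network, self).__init__()
        self.nl = nn.ReLU()
        # ParameterList registers each nn.Parameter with the module
        self.weights = nn.ParameterList([nn.Parameter(torch.randn(2, 2)) for i in range(L)])

    def forward(self, x):
        # illustrative forward pass: multiply by each 2x2 weight, then apply ReLU
        for w in self.weights:
            x = self.nl(x @ w)
        return x

net = Network()
out = net(torch.randn(4, 2))
print(out.shape)                     # torch.Size([4, 2])
print(len(list(net.parameters())))  # 3: all list entries are registered
```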

Upvotes: 0
