AWhite

Reputation: 75

PyTorch: Sigmoid of weights?

I'm new to neural networks/PyTorch. I'm trying to make a net that takes a vector x as input; the first layer is h_j = w_j^T * x + b_j, and the output is max_j{h_j}. The only catch is that I want the w_j restricted between 0 and 1, by setting w_j = S(k*a_j), where S is the sigmoid function, k is some constant, and a_j are the actual weight variables (each w_j is just a function of a_j). How do I do this in PyTorch? I can't just use a torch.nn.Linear layer; is there something else/additional I need in order to apply the sigmoid function to the weights?

Side question: for that last output layer, can I just use torch.max to get the max of the previous layer's outputs? Does that behave nicely, or is there some torch.nn.Max or some pooling machinery that I don't understand that needs to happen?

Thanks!

Upvotes: 1

Views: 1401

Answers (1)

Umang Gupta

Reputation: 16490

I am really not sure why you would do that, but you can declare a custom layer as below to apply a sigmoid to the weights.

import torch
import torch.nn as nn

class NewLayer(nn.Module):
    def __init__(self, input_size, output_size):
        super().__init__()
        # raw weights; the forward pass uses their sigmoid, so the
        # effective weights stay between 0 and 1
        self.W = nn.Parameter(torch.zeros(input_size, output_size))
        # kaiming initialization (use any other if you like)
        nn.init.kaiming_normal_(self.W)
        self.b = nn.Parameter(torch.ones(output_size))

    def forward(self, x):
        # apply sigmoid to the weights, then compute x @ sigmoid(W) + b
        return torch.addmm(self.b, x, torch.sigmoid(self.W))

Once you do this, you can use it just as you would use a linear layer in your code.
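As for the side question, torch.max along a dimension works fine for the final max; it returns a (values, indices) pair and gradients flow through the selected values. Below is a minimal usage sketch, assuming made-up sizes (10 inputs, 5 hidden units, batch of 32). The constant k from the question isn't in the layer above; one way to include it would be to use torch.sigmoid(k * self.W) in forward.

layer = NewLayer(input_size=10, output_size=5)
x = torch.randn(32, 10)           # batch of 32 input vectors
h = layer(x)                      # shape: (32, 5)
# max over the hidden units; torch.max returns (values, indices)
out = torch.max(h, dim=1).values  # shape: (32,)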

Upvotes: 1
