pgmcr

Reputation: 79

How to add hidden neurons to a Pytorch RNN

How can I add hidden neurons to a recurrent neural network in PyTorch? As I understand it, torch.nn.RNN has n neurons whose inputs are the input vector and the hidden state, where n equals the size of the hidden state.

How can I add additional layers before the neurons feed back into the hidden state? For example, if I only have 1 input and 1 output but want to be able to model more complex functions?

I tried using the num_layers parameter, but this just adds more layers of single neurons. I also tried using torch.nn.Sequential to stack individual RNNs with differently sized inputs/outputs, but this didn't work because Sequential objects don't seem to pass through additional parameters (h0, the initial hidden state); a sketch of that attempt is below.
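Roughly, the Sequential attempt looked something like this (the sizes are just illustrative, not my exact code):

import torch
import torch.nn as nn

stacked = nn.Sequential(
    nn.RNN(input_size=1, hidden_size=16, batch_first=True),
    nn.RNN(input_size=16, hidden_size=1, batch_first=True),
)

x = torch.randn(8, 50, 1)    # (batch, seq_len, input_size)
h0 = torch.zeros(1, 8, 16)   # (num_layers, batch, hidden_size) for the first RNN

# stacked(x, h0) fails: nn.Sequential.forward only accepts a single input,
# so there is no way to hand h0 to the first RNN. On top of that, each nn.RNN
# returns an (output, h_n) tuple, which the next module cannot consume directly.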

I'm trying to model f(x) = sin(x), with the initial hidden state being the initial value of the sine wave (sin(x_0)), the inputs being x and the outputs being sin(x).
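For context, a minimal sketch of the setup I have in mind (shapes and values are illustrative):

import math
import torch
import torch.nn as nn

rnn = nn.RNN(input_size=1, hidden_size=1, num_layers=1, batch_first=True)

x = torch.linspace(0, 2 * math.pi, 100).reshape(1, 100, 1)  # inputs: x values
y = torch.sin(x)                                            # targets: sin(x)
h0 = torch.sin(x[:, :1, :]).reshape(1, 1, 1)                # initial hidden state: sin(x_0)

out, hn = rnn(x, h0)   # out has shape (1, 100, 1): one prediction per time step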

Upvotes: 0

Views: 1365

Answers (1)

Ahx

Reputation: 7995

You can't define an RNN without defining hidden neurons.

Let's look at the official example:

from torch import cat, zeros
from torch.nn import Linear, LogSoftmax, Module
from torch.nn.functional import relu


class RNNTutorial(Module):
    def __init__(self, input_size, hidden_size,
                 output_size):
        super(RNNTutorial, self).__init__()
        self.hidden_size = hidden_size
        size_sum = input_size + hidden_size
        # Both layers see the concatenation of the input and the hidden state.
        self.i2h = Linear(size_sum, hidden_size)   # input-to-hidden
        self.i2o = Linear(size_sum, output_size)   # input-to-output
        self.softmax = LogSoftmax(dim=1)

    def forward(self, input_, hidden_):
        combined = cat(tensors=(input_, hidden_), dim=1)
        hidden_ = self.i2h(input=combined)   # next hidden state
        hidden_ = relu(hidden_)
        output = self.i2o(input=combined)
        output = self.softmax(input=output)
        return output, hidden_

    def init_hidden(self):
        return zeros(1, self.hidden_size)

Above is an RNN structure with two linear layers. In the first layer

self.i2h = Linear(size_sum, hidden_size)

the hidden-neuron input size is size_sum (input_size + hidden_size) and the output size is hidden_size, i.e. the number of hidden neurons.

How do you add neurons? You can change these parameter values.

For instance, with hidden_size + 1 you add one more hidden neuron (and size_sum grows with it, since size_sum = input_size + hidden_size).
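As a quick sanity check (using the RNNTutorial class above; the sizes are just illustrative), you can see the extra neuron in the shape of the i2h weights:

small = RNNTutorial(input_size=1, hidden_size=4, output_size=1)
large = RNNTutorial(input_size=1, hidden_size=5, output_size=1)   # one more hidden neuron

print(small.i2h.weight.shape)   # torch.Size([4, 5]) -> 4 hidden neurons, each seeing 1 + 4 inputs
print(large.i2h.weight.shape)   # torch.Size([5, 6]) -> 5 hidden neurons, each seeing 1 + 5 inputs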

Upvotes: 1
