Reputation: 1171
I am pretty new to the whole neural network scene, and I was just going through a couple of tutorials on LSTM cells, specifically in TensorFlow.
In the tutorial, they have an object, tf.nn.rnn_cell.MultiRNNCell, which from my understanding is a vertical layering of LSTM cells, similar to stacking layers in a convolutional network. However, I couldn't find anything about horizontal LSTM cells, in which the output of one cell is the input of another.
I understand that because the cells are recurrent they wouldn't need this, but I was just trying to see whether it is possible at all.
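For context, this is roughly how I understand the vertical stacking, sketched with the TF 1.x API from the tutorial (the shapes and sizes here are made up):

    import tensorflow as tf  # assumes the TF 1.x API used in the tutorial

    # Made-up sizes, just for illustration.
    batch_size, seq_len, input_dim, num_units = 32, 10, 8, 64

    # Vertical stack: at each time step, each layer's output feeds the
    # layer above it.
    cells = [tf.nn.rnn_cell.BasicLSTMCell(num_units) for _ in range(3)]
    stacked = tf.nn.rnn_cell.MultiRNNCell(cells)

    inputs = tf.placeholder(tf.float32, [batch_size, seq_len, input_dim])
    # dynamic_rnn unrolls the stack over time; that unrolling is the
    # 'horizontal' direction.
    outputs, final_state = tf.nn.dynamic_rnn(stacked, inputs,
                                             dtype=tf.float32)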
Cheers!
Upvotes: 1
Views: 447
Reputation: 1
Horizontal stacking is useless in any case I can think of. A common source of confusion is the "unrolled" visualization of an RNN, which makes it look as though there were multiple cells with different parameters.
An RNN loops over itself: for every input it produces output that becomes part of the next input to the same cell, so the same weights are applied over and over. If you separated those chained cells and trained each on its own time step, I suspect the weights would converge to roughly the same values anyway, so the result would be similar (or equal) to just using one RNN cell that feeds its output back as input.
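To make that concrete, here is a rough sketch (assuming the TF 1.x API; shapes are made up) of what chaining cells "horizontally" by hand amounts to. Calling the same cell object at every step applies the same weights, which is exactly the recurrence that dynamic_rnn performs for you:

    import tensorflow as tf  # assumes TF 1.x

    batch_size, input_dim, num_steps = 32, 8, 5  # made-up sizes
    cell = tf.nn.rnn_cell.BasicLSTMCell(64)

    # One placeholder per time step, as if each step were a separate 'cell'.
    xs = [tf.placeholder(tf.float32, [batch_size, input_dim])
          for _ in range(num_steps)]

    state = cell.zero_state(batch_size, tf.float32)
    outputs = []
    for x_t in xs:
        # Every call reuses the same weights, so this hand-written
        # 'horizontal chain' is just the RNN's normal unrolling.
        output, state = cell(x_t, state)
        outputs.append(output)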
You can use multiple cells in a kind of "horizontal" arrangement in an encoder-decoder model.
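A rough sketch of that case (again TF 1.x, made-up shapes): the encoder and decoder are two distinct cells with their own weights, and the encoder's final state becomes the decoder's initial state:

    import tensorflow as tf  # assumes TF 1.x

    enc_inputs = tf.placeholder(tf.float32, [32, 10, 8])   # made-up shapes
    dec_inputs = tf.placeholder(tf.float32, [32, 12, 8])

    with tf.variable_scope("encoder"):
        enc_cell = tf.nn.rnn_cell.BasicLSTMCell(64)
        _, enc_state = tf.nn.dynamic_rnn(enc_cell, enc_inputs,
                                         dtype=tf.float32)

    with tf.variable_scope("decoder"):
        # A second cell with separate weights; the encoder's final state
        # seeds it, so two *different* cells really are chained here.
        dec_cell = tf.nn.rnn_cell.BasicLSTMCell(64)
        dec_outputs, _ = tf.nn.dynamic_rnn(dec_cell, dec_inputs,
                                           initial_state=enc_state)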
Upvotes: 0
Reputation: 36
However, I couldn't find anything about horizontal LSTM cells, in which the output of one cell is the input of another.
This is the definition of recurrence. All RNNs do this.
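In symbols: an RNN computes h_t = f(x_t, h_{t-1}), so the output (state) of the cell at step t-1 is already an input to the cell at step t. Your "horizontal" chain is this same computation, with the one cell reused at every step.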
Upvotes: 1