user1157751

Reputation: 2457

Recurrent Neural Network (RNN) - Forget Layer, and TensorFlow

I'm new to RNNs, and I'm trying to figure out the specifics of LSTM cells and their relation to TensorFlow: Colah GitHub. Does the GitHub website's example use the same LSTM cell as TensorFlow? The only thing I found on the TensorFlow site is that the basic LSTM cell uses the following architecture: Paper. If it's the same architecture, then I can hand-compute the numbers for an LSTM cell and see if they match.

Also when we set a basic LSTM cell in tensorflow, it takes in a num_units according to: TensorFlow documentation

tf.nn.rnn_cell.GRUCell.__init__(num_units, input_size=None, activation=tanh)

Is this the size of the hidden state (h_t) and the cell state (C_t)?

The GitHub website doesn't mention the sizes of the cell state and hidden state separately. I'm assuming they have to be the same number?
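For the hand-computation idea above, here is a minimal NumPy sketch of the standard LSTM step equations from Colah's post (forget, input, candidate, output gates). The weight names `W`, `U`, `b` and the gate ordering are my own choices, not anything fixed by TensorFlow; the point is that `h_t` and `C_t` both come out with length `num_units`:

```python
import numpy as np

def sigmoid(v):
    return 1.0 / (1.0 + np.exp(-v))

def lstm_step(x, h_prev, c_prev, W, U, b):
    """One step of the standard LSTM equations.

    Gate order in the stacked weights (an assumed convention here):
    forget, input, candidate, output. num_units = h_prev.size,
    and h and c share that size.
    """
    n = h_prev.size
    z = W @ x + U @ h_prev + b          # shape (4 * num_units,)
    f = sigmoid(z[0:n])                 # forget gate
    i = sigmoid(z[n:2*n])               # input gate
    g = np.tanh(z[2*n:3*n])             # candidate cell state
    o = sigmoid(z[3*n:4*n])             # output gate
    c = f * c_prev + i * g              # new cell state C_t
    h = o * np.tanh(c)                  # new hidden state h_t
    return h, c

# Example: num_units = 3, input_size = 2
num_units, input_size = 3, 2
rng = np.random.default_rng(0)
W = rng.standard_normal((4 * num_units, input_size))
U = rng.standard_normal((4 * num_units, num_units))
b = np.zeros(4 * num_units)
h, c = lstm_step(rng.standard_normal(input_size),
                 np.zeros(num_units), np.zeros(num_units), W, U, b)
# h.shape and c.shape are both (num_units,)
```

With fixed weights you can compare each gate value against a hand calculation for a single step.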

Upvotes: 8

Views: 533

Answers (1)

hurturk

Reputation: 5454

The implementation looks the same: the GRUCell class doc points to the same paper (specifically, the gated variant) linked in Colah's article. The parameter num_units is the number of cells (assuming that is the hidden layer size), and it corresponds to output_size per the property definition.

Upvotes: 4
