Reputation: 2683
I'm moving my comments from https://github.com/tensorflow/tensorflow/issues/8833 to StackOverflow as SO seems more appropriate.
I'm attempting to implement a sequence-to-sequence model using tensorflow.contrib.seq2seq and tensorflow.contrib.rnn's BasicLSTMCell. Within rnn_cell_impl.py, the line c, h = state causes the following error:
TypeError: 'Tensor' object is not iterable.
When stepping through the code, I learned that the error is raised the third time c, h = state is evaluated. The first two times, state has type <class 'tensorflow.python.ops.rnn_cell_impl.LSTMStateTuple'>, but on the third time, state has type <class 'tensorflow.python.framework.ops.Tensor'>. Clearly, I want state to be an LSTMStateTuple on the third pass as well, but I have no idea what might be causing the switch.
The problematic state tensor's name is define_model/define_decoder/decoder/while/Identity_3. I wrote the methods define_model() and define_decoder(), and the remaining information suggests that something is happening inside my decoder.
In case it's relevant, I'm using Python 3.6 and Tensorflow 1.2.
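For context on the error itself: BasicLSTMCell unpacks its state with c, h = state, which only succeeds when state is a 2-tuple such as LSTMStateTuple. A graph-mode Tensor defines __iter__ to raise, which produces exactly this TypeError. A minimal self-contained sketch of the mechanism (using a stand-in class, since installing TF 1.2 is impractical; FakeTensor and the namedtuple are illustrative stand-ins, not the real TensorFlow classes):

```python
from collections import namedtuple

# Stand-in for tf.contrib.rnn.LSTMStateTuple: just a named 2-tuple of (c, h).
LSTMStateTuple = namedtuple("LSTMStateTuple", ["c", "h"])

class FakeTensor:
    """Stand-in for a graph-mode tf.Tensor, whose __iter__ raises TypeError."""
    def __iter__(self):
        raise TypeError("'Tensor' object is not iterable.")

# Unpacking an LSTMStateTuple works, because it really is a tuple.
c, h = LSTMStateTuple(c=1, h=2)

# Unpacking a tensor fails with the error from the question.
try:
    c, h = FakeTensor()
except TypeError as e:
    print(e)  # 'Tensor' object is not iterable.
```

So the bug is not inside BasicLSTMCell: somewhere upstream, a plain tensor is being passed where an LSTMStateTuple is expected.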
Upvotes: 0
Views: 466
Reputation: 11
I think a similar answer can be found here. The code converts a cuDNN cell state to TensorFlow's internal state representation. See this method:
def cudnn_lstm_state_to_state_tuples(cudnn_lstm_state):
Upvotes: 0
Reputation: 2683
The answer can be found at the GitHub issue page linked above.
To briefly summarize: the problem was that my encoder used a bidirectional RNN, which produces a 2-tuple of LSTMStateTuples, i.e. one c and one h state for each directional RNN. The decoder, however, accepts a single cell, which has a single LSTMStateTuple associated with it. To solve the problem, you need to separately concatenate the c states and h states of the two directional RNNs, wrap the result in a new LSTMStateTuple, and pass that as the decoder's state.
Upvotes: 1