Reputation: 126
Is it possible to implement a dynamic RNN with attention instead of bucketing in TensorFlow?
If not, how should we implement it?
Many thanks.
Upvotes: 3
Views: 2343
Reputation: 1010
Use tf.contrib.rnn.AttentionCellWrapper.
Example:
cell_with_attention = tf.contrib.rnn.AttentionCellWrapper(cell, attn_length, ...)
outputs, state = tf.nn.dynamic_rnn(cell_with_attention, inputs, ...)
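A slightly fuller sketch of the same idea (TensorFlow 1.x); the LSTMCell, tensor shapes, attn_length, and the sequence_length placeholder are illustrative choices, not part of the original answer:

import tensorflow as tf

batch_size, max_time, input_dim = 32, 50, 128   # assumed dimensions
num_units, attn_length = 256, 20                # assumed hyperparameters

inputs = tf.placeholder(tf.float32, [batch_size, max_time, input_dim])
seq_len = tf.placeholder(tf.int32, [batch_size])

# Wrap any RNNCell so it attends over a window of its last attn_length outputs.
cell = tf.contrib.rnn.LSTMCell(num_units)
cell_with_attention = tf.contrib.rnn.AttentionCellWrapper(
    cell, attn_length=attn_length, state_is_tuple=True)

# dynamic_rnn unrolls to the actual sequence lengths at run time,
# so no bucketing of the inputs is needed.
outputs, state = tf.nn.dynamic_rnn(
    cell_with_attention, inputs,
    sequence_length=seq_len, dtype=tf.float32)

Because dynamic_rnn handles variable-length sequences via sequence_length, the wrapped cell can replace a bucketed setup directly.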
Upvotes: 2