Sycorax

Reputation: 1386

Generative sequences in Tensorflow

I'm using the following code to generate a sequence of length num_steps, given starting_point and starting_state, using an instance of an RNNCell, e.g. cell = GRUCell(number_of_neurons).

import tensorflow as tf

# Unrolled generation: feed each step's output back in as the next input.
outputs = [starting_point]
state = starting_state
for time_step in range(num_steps):
    if time_step > 0:
        tf.get_variable_scope().reuse_variables()
    cell_output, state = cell(outputs[time_step], state)
    outputs.append(cell_output)

But this is slow and cumbersome for my use case, where num_steps = 1000: since the Python loop is unrolled into one copy of the cell per step, even building the graph takes forever.

Does this functionality already exist somewhere in TensorFlow, and I just missed it?

Note that what I'm looking for is similar to, but distinct from, the behavior of tf.contrib.rnn.static_rnn. The documentation summarizes the behavior of this function as simply applying the RNN to each time step in a sequence:

state = cell.zero_state(...)
outputs = []
for input_ in inputs:
    output, state = cell(input_, state)
    outputs.append(output)
return (outputs, state)

But in my case, I want to feed the output from one step back in as the input to the next step; a rough sketch of what I mean follows.
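Conceptually, I want something like this tf.while_loop version, which keeps the graph small because the loop body is only traced once (a rough sketch with illustrative placeholder sizes, not my actual code):

import tensorflow as tf

num_steps = 1000
batch_size = 16   # illustrative placeholder values
num_units = 64

cell = tf.nn.rnn_cell.GRUCell(num_units)
starting_point = tf.zeros([batch_size, num_units])        # stand-in first input
starting_state = cell.zero_state(batch_size, tf.float32)

# Collect one output per step without unrolling the loop into the graph.
outputs_ta = tf.TensorArray(tf.float32, size=num_steps)

def body(t, current_input, state, ta):
    # Each step's output becomes the next step's input.
    output, new_state = cell(current_input, state)
    return t + 1, output, new_state, ta.write(t, output)

_, _, final_state, outputs_ta = tf.while_loop(
    cond=lambda t, *_: t < num_steps,
    body=body,
    loop_vars=(tf.constant(0), starting_point, starting_state, outputs_ta))

outputs = outputs_ta.stack()  # shape: [num_steps, batch_size, num_units]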

Upvotes: 1

Views: 545

Answers (1)

Eugene Brevdo

Reputation: 899

In the TensorFlow nightly builds, see tf.contrib.seq2seq for the dynamic decoder objects. You can use the scheduled sampling helpers to do what you want. Alternatively, use tf.nn.dynamic_rnn and feed all zeros as inputs: the LSTM's h state is also the LSTM's output, so you get essentially the behavior you want.
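A minimal sketch of the tf.nn.dynamic_rnn approach, assuming the TF 1.x API (the sizes here are illustrative placeholders, and you would substitute your own starting state):

import tensorflow as tf

num_steps = 1000
batch_size = 16   # illustrative placeholder values
num_units = 64

cell = tf.nn.rnn_cell.LSTMCell(num_units)

# With all-zero inputs, each step is driven entirely by the recurrent
# state; since the LSTM's output is its h state, the previous output is
# implicitly fed into every step.
zero_inputs = tf.zeros([batch_size, num_steps, num_units])
initial_state = cell.zero_state(batch_size, tf.float32)  # or a computed starting state

outputs, final_state = tf.nn.dynamic_rnn(
    cell, zero_inputs, initial_state=initial_state)
# outputs has shape [batch_size, num_steps, num_units]

Because dynamic_rnn runs the cell inside a single symbolic while loop, the graph size stays constant no matter how large num_steps is.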

Upvotes: 2
