grt1st

Reputation: 21

Can we feed a value without defining tf.placeholder?

Here is the code: https://github.com/tensorflow/tensorflow/blob/r0.10/tensorflow/models/rnn/ptb/ptb_word_lm.py. I wonder why we can feed the model with:

cost, state, _ = session.run([m.cost, m.final_state, eval_op],
                              {m.input_data: x,
                               m.targets: y,
                               m.initial_state: state})

initial_state isn't defined with tf.placeholder, so how can we feed it?

In the code, a class defines self._initial_state = cell.zero_state(batch_size, data_type()), then state = self._initial_state, and inside the time-step loop (cell_output, state) = cell(inputs[:, time_step, :], state). After the loop, self._final_state = state. It also defines a property in the class:

@property
def final_state(self):
    return self._final_state
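
Condensed, the pieces described above look roughly like this (a sketch of the model class in the linked script; graph construction and everything else are omitted):

class PTBModel(object):
    def __init__(self, is_training, config):
        # ... cell, inputs, batch_size, num_steps built here (omitted) ...
        self._initial_state = cell.zero_state(batch_size, data_type())
        state = self._initial_state
        for time_step in range(num_steps):
            (cell_output, state) = cell(inputs[:, time_step, :], state)
        self._final_state = state

    @property
    def initial_state(self):
        return self._initial_state

    @property
    def final_state(self):
        return self._final_state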

And here comes

state = m.initial_state.eval()
cost, state, _ = session.run([m.cost, m.final_state, eval_op],
                             {m.input_data: x,
                              m.targets: y,
                              m.initial_state: state})

I have run the code locally, and the results are quite different if state is left out of the feed_dict.

Can anyone help?

Upvotes: 1

Views: 86

Answers (1)

Ishant Mrinal

Reputation: 4918

initial_state isn't defined with tf.placeholder, so how can we feed it?

A placeholder is just a TensorFlow tensor; you can feed any tensor in the graph through the feed_dict mechanism.
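
As a minimal sketch (using the TF 1.x graph/session style that the linked script is written in), feeding a non-placeholder tensor simply overrides whatever the graph would otherwise compute for it:

import tensorflow as tf

a = tf.constant(3.0)   # not a placeholder
b = a * 2.0

with tf.Session() as sess:
    print(sess.run(b))              # 6.0  -> a comes from the graph
    print(sess.run(b, {a: 10.0}))   # 20.0 -> the fed value overrides a

That is exactly what happens with m.initial_state: it is produced by cell.zero_state(...) rather than tf.placeholder, but it can still be overridden through feed_dict.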

I have run the code locally, and the results are quite different if state is left out of the feed_dict.

If you don't feed the learned state from the previous batch, the state is re-initialized to zero for the next batch, so the results degrade.
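
A sketch of the loop pattern from the script (names such as m, eval_op, session come from the question's own code; batches stands in for the script's data iterator):

state = m.initial_state.eval()              # start from the zero state
for x, y in batches:
    cost, state, _ = session.run(
        [m.cost, m.final_state, eval_op],
        {m.input_data: x,
         m.targets: y,
         m.initial_state: state})           # carry the state forward
# Dropping m.initial_state from the feed_dict makes every batch start from
# cell.zero_state(...), so the RNN loses the context of the preceding batch.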

Upvotes: 2
