user3970726

Tensorflow: Feeding placeholder from variable

I'm feeding my TensorFlow computation (training) graph using an input queue and the tf.train.batch function, which prepares a large tensor of data. I have another queue with test data that I would like to feed to the graph every 50th step.

Question

Given the form of the input (tensors), do I have to define a separate test graph for the test-data computation, or can I somehow reuse the train graph?

# Prepare data
batch = tf.train.batch([train_image, train_label], batch_size=200)
batchT = tf.train.batch([test_image, test_label], batch_size=200)

x = tf.reshape(batch[0], [-1, IMG_SIZE, IMG_SIZE, 3])
y_ = batch[1]
xT = tf.reshape(batchT[0], [-1, IMG_SIZE, IMG_SIZE, 3])
y_T = batchT[1]

# Graph definition
train_step = ...  # train_step = g(x)

# Session
sess = tf.Session()
sess.run(tf.initialize_all_variables())
tf.train.start_queue_runners(sess=sess)  # start the threads that fill the batch queues

for i in range(1000):
  if i % 50 == 0:
    # here I would like to reuse the train graph, but with tensor x replaced by xT
    # train_accuracy = ?
    # print("step %d, training accuracy %g" % (i, train_accuracy))
    pass
  train_step.run(session=sess)

I would use placeholders, but I can't feed a tf.placeholder with tf.Tensors, and tensors are what I get from the queues. How is this supposed to be done?
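To make it concrete, what I had in mind is roughly the sketch below (x_ph, y_ph and accuracy are hypothetical names, and I'm not sure whether pulling the batches out with sess.run and feeding them back in is the intended approach):

# hypothetical placeholders the model would be built on instead of x / y_
x_ph = tf.placeholder(tf.float32, [None, IMG_SIZE, IMG_SIZE, 3])
y_ph = tf.placeholder(tf.int32, [None])   # label dtype is a guess
# ... model, train_step and a hypothetical accuracy op built on x_ph, y_ph ...

for i in range(1000):
  if i % 50 == 0:
    # evaluate the test-queue tensors into numpy arrays, then feed those
    test_x, test_y = sess.run([xT, y_T])
    test_accuracy = accuracy.eval(session=sess, feed_dict={x_ph: test_x, y_ph: test_y})
    print("step %d, test accuracy %g" % (i, test_accuracy))
  batch_x, batch_y = sess.run([x, y_])
  train_step.run(session=sess, feed_dict={x_ph: batch_x, y_ph: batch_y})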

I'm really just starting.

Upvotes: 2

Views: 1398

Answers (1)

Alon Burg

Reputation: 2540

Take a look at how this is done in the MNIST example: you need to use a placeholder with an initializer of the non-tensor form of your data (such as filenames or CSV rows), and then, inside the graph, use slice_input_producer -> decode_jpeg (or whatever fits your data) -> tf.train.batch() to create batches and feed those to the computation graph.

So your graph looks something like this (a rough sketch follows the list):

  • Placeholder initialized with the big filenames list/CSV/range
  • tf.train.slice_input_producer
  • tf.image.decode_jpeg or tf.py_func - loads the actual data
  • tf.train.batch - creates mini-batches for training
  • feed those batches to your model
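A minimal sketch of that pipeline, assuming the data is a list of JPEG files that are already IMG_SIZE x IMG_SIZE; the filenames, labels and IMG_SIZE values are hypothetical, and the placeholder-initialized variables follow the preloaded-data pattern from the MNIST examples:

import tensorflow as tf

IMG_SIZE = 64                                  # hypothetical image size
filenames = ["img_0.jpg", "img_1.jpg"]         # hypothetical list of JPEG paths
labels = [0, 1]                                # hypothetical integer labels

# placeholders hold the non-tensor form of the data (filenames + labels);
# the variables are filled from them once, outside the normal init collection
filenames_ph = tf.placeholder(tf.string, shape=[len(filenames)])
labels_ph = tf.placeholder(tf.int32, shape=[len(labels)])
filenames_var = tf.Variable(filenames_ph, trainable=False, collections=[])
labels_var = tf.Variable(labels_ph, trainable=False, collections=[])

# slice_input_producer emits one (filename, label) pair at a time
filename, label = tf.train.slice_input_producer([filenames_var, labels_var], shuffle=True)

# load the actual data
image = tf.image.decode_jpeg(tf.read_file(filename), channels=3)
image.set_shape([IMG_SIZE, IMG_SIZE, 3])       # assumes images are already IMG_SIZE x IMG_SIZE

# create mini-batches and feed them to the model
image_batch, label_batch = tf.train.batch([image, label], batch_size=200)
# ... build the model on image_batch / label_batch ...

sess = tf.Session()
sess.run(filenames_var.initializer, feed_dict={filenames_ph: filenames})
sess.run(labels_var.initializer, feed_dict={labels_ph: labels})
sess.run(tf.initialize_all_variables())
tf.train.start_queue_runners(sess=sess)

A second placeholder/variable pair could be set up the same way for the test file list.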

Upvotes: 2
