Reputation: 1111
I am kind of confused about why we are using feed_dict. According to my friend, you commonly use feed_dict when you use placeholder, and this is probably something bad for production. I have seen code like this, in which feed_dict is not involved:
for j in range(n_batches):
    X_batch, Y_batch = mnist.train.next_batch(batch_size)
    _, loss_batch = sess.run([optimizer, loss], {X: X_batch, Y: Y_batch})
I have also seen code like this, in which feed_dict is involved:
for i in range(100):
    for x, y in data:
        # Session executes the optimizer and fetches the value of loss
        _, l = sess.run([optimizer, loss], feed_dict={X: x, Y: y})
        total_loss += l
I understand that with feed_dict you are feeding in data, using X as the key as if in a dictionary. But here I don't see any difference. So, what exactly is the difference, and why do we need feed_dict?
Upvotes: 23
Views: 17100
Reputation: 32051
In a TensorFlow model you can define a placeholder such as x = tf.placeholder(tf.float32), then you will use x in your model.
For example, I define a simple set of operations as:
x = tf.placeholder(tf.float32)
y = x * 42
Now when I ask TensorFlow to compute y, it's clear that y depends on x.
with tf.Session() as sess:
    sess.run(y)
This will produce an error because I did not give it a value for x. Because x is a placeholder, if it is used in a computation you must pass a value for it via feed_dict; if you don't, it's an error.
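As a minimal sketch of that failure (written against tf.compat.v1 so it also runs under TensorFlow 2; the answer's own snippets use the plain TF 1.x API), the unfed placeholder raises an InvalidArgumentError at run time:

```python
import tensorflow.compat.v1 as tf

tf.disable_eager_execution()  # graph mode, as in TF 1.x

x = tf.placeholder(tf.float32)
y = x * 42

with tf.Session() as sess:
    try:
        sess.run(y)  # x is never fed
    except tf.errors.InvalidArgumentError:
        print("error: a value for placeholder x must be fed")
```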
Let's fix that:
with tf.Session() as sess:
    sess.run(y, feed_dict={x: 2})
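This is also why feed_dict appears inside training loops in the question: the same graph can be evaluated repeatedly with different inputs just by changing the value bound to x. A runnable sketch (again written against tf.compat.v1 so it works under TensorFlow 2 as well):

```python
import tensorflow.compat.v1 as tf

tf.disable_eager_execution()  # graph mode, as in TF 1.x

x = tf.placeholder(tf.float32)
y = x * 42

with tf.Session() as sess:
    # Re-run the same graph with different values bound to x.
    print(sess.run(y, feed_dict={x: 2}))   # 84.0
    print(sess.run(y, feed_dict={x: 10}))  # 420.0
```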
The result this time will be 84. Great. Now let's look at a trivial case where feed_dict is not needed:
x = tf.constant(2)
y = x * 42
Now there are no placeholders (x is a constant), so nothing needs to be fed to the model. This works now:
with tf.Session() as sess:
    sess.run(y)
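One lesser-known detail, which goes slightly beyond the answer above (a sketch written against tf.compat.v1 so it also runs under TensorFlow 2): in graph mode, feed_dict is not limited to placeholders. Session.run lets you feed a value for any feedable tensor, temporarily overriding what the graph would otherwise compute, including the constant here:

```python
import tensorflow.compat.v1 as tf

tf.disable_eager_execution()  # graph mode, as in TF 1.x

x = tf.constant(2)
y = x * 42

with tf.Session() as sess:
    print(sess.run(y))                    # 84, nothing needs to be fed
    # feed_dict can also override a non-placeholder tensor for this run:
    print(sess.run(y, feed_dict={x: 5}))  # 210
```

The difference is that a placeholder *must* be fed whenever it is used, while other tensors *may* be fed as an override.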
Upvotes: 32