Max Power

Reputation: 992

Loop within tf.Session()

In my TensorFlow project I'd like to try different optimizers and ideally I'd loop over them, so I can compare the training in TensorBoard. I've extracted the optimization bit because that fails. I have something like this:

import os
import tensorflow as tf

eta = 0.1
num_epochs = 100

xv = tf.Variable(0.0)
sv = tf.Variable(0, trainable=False)

loss = xv * xv - 4.0 * xv + 5.0

optimizers = [tf.train.GradientDescentOptimizer(eta),
              tf.train.AdagradOptimizer(eta),
              tf.train.AdamOptimizer(eta)]

init = tf.global_variables_initializer()

saver = tf.train.Saver()

summary_op = tf.summary.scalar('x', xv)
writer = tf.summary.FileWriter('log', graph=tf.get_default_graph())

with tf.Session() as sess:
    sess.run(init)
    for optimizer in optimizers:
        objective = optimizer.minimize(loss, global_step=sv)
        for epoch in range(num_epochs):
            _, step, result, summary = sess.run([objective, sv, xv, summary_op])
            writer.add_summary(summary, global_step=step)
            writer.flush()
        saver.save(sess, os.getcwd() + '/output')
        print(sess.run(xv))

It fails with:

Errors may have originated from an input operation. Input Source operations connected to node Adagrad/update_Variable_2/ApplyAdagrad: Variable_2 (defined at <stdin>:1)

Can something like this be done in TensorFlow, or am I misusing the session by trying to run the same training from scratch several times within one session without re-initializing the variables? I'm not sure how to achieve that, though. I also tried placing the loop outside of tf.Session(), but that did not work either.

Upvotes: 2

Views: 485

Answers (1)

Fan Luo

Reputation: 106

Usually you attach only one optimizer to a given set of variables.

When you call optimizer.minimize(loss), TensorFlow automatically selects the variables that affect loss and optimizes them. Calling optimizer.minimize(loss) three times on the same loss and then running the resulting ops alternately within the same session is not advisable.
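A minimal sketch of why the posted code fails, stripped down to the essentials (assuming TF 1.x APIs; under TF 2 the same API lives in tf.compat.v1): minimize() creates the optimizer's slot variables, and an initializer op that was built *before* that call does not cover them:

```python
try:
    import tensorflow.compat.v1 as tf  # TF 2 (and TF >= 1.14)
    tf.disable_eager_execution()
except ImportError:
    import tensorflow as tf            # plain TF 1.x

xv = tf.Variable(0.0)
loss = xv * xv - 4.0 * xv + 5.0

init = tf.global_variables_initializer()  # covers only variables that exist NOW

opt = tf.train.AdagradOptimizer(0.1)
train = opt.minimize(loss)                 # creates Adagrad's accumulator slot AFTER init was built

with tf.Session() as sess:
    sess.run(init)
    uninit = sess.run(tf.report_uninitialized_variables())
    print(len(uninit))  # at least 1: the slot variable was never initialized
```

Running `sess.run(train)` at this point raises the "uninitialized value" error the question quotes, because the Adagrad accumulator is one of those uncovered slot variables.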

If you want to compare the three optimizers, it is cleaner to test each one in its own script, or at least in its own graph and session.

Besides, you should call optimizer.minimize(...) only once per optimizer, because each repeated call just adds new operations to the graph. The actual computation is performed when you call session.run(...).
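If you do want a single script, one hedged sketch of how it could be restructured (assuming TF 1.x APIs, or tf.compat.v1 under TF 2; the summary/saver plumbing is dropped for brevity): build every minimize() op before creating the initializer, then re-run the initializer between optimizers so each one starts from the same fresh state:

```python
try:
    import tensorflow.compat.v1 as tf  # TF 2 (and TF >= 1.14)
    tf.disable_eager_execution()
except ImportError:
    import tensorflow as tf            # plain TF 1.x

eta = 0.1
num_epochs = 100

xv = tf.Variable(0.0)
loss = xv * xv - 4.0 * xv + 5.0        # minimum at x = 2

optimizers = [tf.train.GradientDescentOptimizer(eta),
              tf.train.AdagradOptimizer(eta),
              tf.train.AdamOptimizer(eta)]

# Call minimize() exactly once per optimizer, BEFORE the initializer is
# created, so the slot variables each call adds are covered by init.
train_ops = [opt.minimize(loss) for opt in optimizers]

init = tf.global_variables_initializer()

results = []
with tf.Session() as sess:
    for train_op in train_ops:
        sess.run(init)                 # reset x and all slot variables
        for _ in range(num_epochs):
            sess.run(train_op)
        results.append(sess.run(xv))

print(results)
```

Gradient descent and Adam should land close to the minimum at 2.0; Adagrad with this learning rate converges more slowly. For the TensorBoard comparison the question asks about, each run would typically also get its own tf.summary.FileWriter log directory so the curves show up as separate runs.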

Upvotes: 1
