Reputation: 1880
I am a bit confused by tf.local_variables_initializer. I am not sure when to call it.
I see in other people's code where they just slather on calls like
init_op = tf.group(tf.global_variables_initializer(),
                   tf.local_variables_initializer())
Should init_op
just be run in every TF program? Is this "lazy" TF style of programming?
As an example, consider queues like tf.train.string_input_producer.
When specified as:
tf.train.string_input_producer(file_list, num_epochs=None)
it does not require tf.local_variables_initializer().
However, when an actual number of epochs is specified,
tf.train.string_input_producer(file_list, num_epochs=1)
it does in fact require that tf.local_variables_initializer()
is executed.
How does the programmer know when (or when not) to initialise such "hidden" local variables? Shouldn't the FIFOQueue
initialize its own variables, or expose something like queue.initializer?
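For what it's worth, the difference between the two cases can be seen by inspecting tf.local_variables(). A minimal sketch, assuming the TF1-style API (accessed through tf.compat.v1 so it also runs under TensorFlow 2; the file names are made up):

```python
# Demonstrates that num_epochs creates a hidden local variable,
# which global_variables_initializer() does not cover.
import tensorflow.compat.v1 as tf

tf.disable_eager_execution()

file_list = ["a.txt", "b.txt"]  # hypothetical file names, for illustration only

# num_epochs=None: no hidden epoch counter is created.
g1 = tf.Graph()
with g1.as_default():
    tf.train.string_input_producer(file_list, num_epochs=None)
    no_epoch_locals = tf.local_variables()   # empty list

# num_epochs=1: an epoch-counter variable is silently added to the
# LOCAL_VARIABLES collection.
g2 = tf.Graph()
with g2.as_default():
    tf.train.string_input_producer(file_list, num_epochs=1)
    epoch_locals = tf.local_variables()      # contains the epoch counter
```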
Upvotes: 3
Views: 665
Reputation: 57893
For local TensorFlow jobs, you need to initialize both local and global variables. For distributed TensorFlow jobs, the chief worker initializes local+global, whereas the remaining workers initialize only local.
Upvotes: 1
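A sketch of why the two initializers get grouped: a variable placed in the LOCAL_VARIABLES collection is invisible to global_variables_initializer(), so it stays uninitialized until local_variables_initializer() is run. Again assuming the TF1 API via tf.compat.v1, with made-up variable names:

```python
# Shows that a local-collection variable is skipped by the global
# initializer and only covered by the local one.
import tensorflow.compat.v1 as tf

tf.disable_eager_execution()

graph = tf.Graph()
with graph.as_default():
    tf.get_variable("g", initializer=0)  # ordinary (global) variable
    tf.get_variable("l", initializer=0, trainable=False,
                    collections=[tf.GraphKeys.LOCAL_VARIABLES])

    with tf.Session() as sess:
        sess.run(tf.global_variables_initializer())
        # "l" is still uninitialized at this point:
        before = sess.run(tf.report_uninitialized_variables())
        sess.run(tf.local_variables_initializer())
        # now nothing is left uninitialized:
        after = sess.run(tf.report_uninitialized_variables())
```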