jojo

Reputation: 66

When do I have to initialize variables in TensorFlow?

I am trying to understand TensorFlow. As I understand it, one first has to create ops and variables, add them to the graph, and then run those operations in a session. Why, then, don't I have to call initialize_all_variables() in this piece of code? I tried adding init = tf.initialize_all_variables() followed by sess.run(init), but that turned out to be wrong. Why does this work without the initialization?

import tensorflow as tf
import numpy as np
x = tf.placeholder('float', [2,3])
y = x*2
z = tf.Variable([[1,1,1],[1,1,1]], name = "z")
with tf.Session() as sess:
    x_data = np.arange(1,7).reshape((2,3))
    z.assign(x_data)
    res = sess.run(y, feed_dict = {x:x_data})
    print(res.dtype, z.dtype, z.get_shape())

Upvotes: 1

Views: 514

Answers (1)

Yaroslav Bulatov

Reputation: 57893

You are not allowed to read an uninitialized value. In the case above you are not reading z, hence you don't need to initialize it.
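
To make that concrete, here is a minimal sketch (assuming the same TF 1.x graph/session API as the question): y can be evaluated without any initialization because it only depends on the placeholder, while reading z fails until its initializer has run.

import tensorflow as tf
import numpy as np

x = tf.placeholder('float', [2, 3])
y = x * 2
z = tf.Variable([[1, 1, 1], [1, 1, 1]], name="z")

with tf.Session() as sess:
    x_data = np.arange(1, 7).reshape((2, 3))

    # Evaluating y only touches the placeholder, so no variable needs to be initialized.
    print(sess.run(y, feed_dict={x: x_data}))

    # Evaluating z before its initializer has run raises FailedPreconditionError.
    try:
        sess.run(z)
    except tf.errors.FailedPreconditionError:
        print("z is not initialized yet")

    # After running the initializer, z can be read.
    sess.run(z.initializer)
    print(sess.run(z))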

If you look at variables.py, you can see that initialize_all_variables is a group node connected to all of the variable initializers:

def initialize_variables(var_list, name="init"):
    ...
    return control_flow_ops.group(
        *[v.initializer for v in var_list], name=name)
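
For comparison, roughly the same grouped init op can be built by hand with the public tf.group (a sketch, again assuming the TF 1.x API):

import tensorflow as tf

a = tf.Variable(1, name="a")
b = tf.Variable(2, name="b")

# Group the per-variable initializer (Assign) ops, which is what
# initialize_all_variables does internally for every variable in the graph.
manual_init = tf.group(a.initializer, b.initializer)

with tf.Session() as sess:
    sess.run(manual_init)       # runs both initializers in one step
    print(sess.run([a, b]))     # [1, 2]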

Looking at z.initializer, you can see that it's an Assign node, so evaluating tf.initialize_all_variables in TensorFlow is the same as doing session.run on z.assign(...).
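
A small sketch of that equivalence (for the classic, non-resource variables of that API): z.initializer is an Assign op, and running it has the same effect as running an assign of z's initial value.

import tensorflow as tf

z = tf.Variable([[1, 1, 1], [1, 1, 1]], name="z")

# The initializer is an Assign op that writes the initial value into z.
print(z.initializer.type)        # 'Assign'

with tf.Session() as sess:
    sess.run(z.initializer)      # equivalent to running z.assign(<initial value>)
    print(sess.run(z))           # [[1 1 1]
                                 #  [1 1 1]]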

Upvotes: 1
