Phúc Lê

Reputation: 2635

Tensorflow: What is the difference between declaring a variable with tf.Variable and declaring directly?

I'm writing a small sample program that adds and subtracts two numbers using TensorFlow. However, I get different results depending on whether I declare the values with tf.Variable or declare them directly. I suspect there is a fundamental piece of TF knowledge I'm missing that led me to this bug. Here is the code:

x = tf.Variable(tf.random_uniform([], minval=-1, maxval=1))
y = tf.Variable(tf.random_uniform([], minval=-1, maxval=1))
# Declare 2 tf variables with randomly generated initial values.

# The two operations, plus a 'case' statement that picks one of them
addtf = tf.add(x, y)
subtf = tf.subtract(x, y)
out = tf.case({tf.less(x, y): lambda: tf.add(x, y), tf.greater(x, y): lambda: tf.subtract(x, y)}, default=lambda: tf.zeros([]), exclusive=True)

# Run the graph
with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    x_value = x.eval()
    y_value = y.eval()
    print("{}\n".format(x_value - y_value))
    out_value, sub_value, add_value = sess.run([out, subtf, addtf])

# Example output: x_value = 0.53607559
#                 y_value = -0.63836479
#                 add_value = -0.1022892
#                 sub_value = 1.1744404
#                 out_value = 1.1744404

As you can see, the case statement works correctly and the operations are consistent. However, if I omit tf.Variable from the declarations of x and y, things go wild:

x = tf.random_uniform([], minval=-1, maxval=1)
y = tf.random_uniform([], minval=-1, maxval=1)
# ... all the same as above

# Sample output (run in Spyder):
# x_value = -0.91663623
# y_value = -0.80014014
# add_value = 0.26550484,  should be -1.71677637
# sub_value = -0.19451094, should be -0.11649609
# out_value = 0.26550484,  should be -1.71677637

As you can see, the case statement and the operations still behave consistently with each other, but the answers are wrong. I don't understand why the results differ.

Upvotes: 0

Views: 370

Answers (1)

jasekp

Reputation: 1010

When you declare a variable, as in

x_var = tf.Variable(tf.random_uniform([], minval=-1, maxval=1))

the random value is generated once at initialization and stored in the variable; it changes only if you update it with an assignment operation.

Alternatively, declaring

x_op = tf.random_uniform([], minval=-1, maxval=1)

defines an operation that generates a new random number each time it is evaluated. So in your second version, x and y are re-sampled on every sess.run/eval call: the values you print for x_value and y_value come from different random draws than the ones used to compute add_value, sub_value, and out_value.

For example:

# After calling
sess.run(tf.global_variables_initializer())

sess.run(x_var)  # Returns the same randomly generated value every time
sess.run(x_op)   # Returns a different random value on every call

I hope this helps explain why the second version of the code behaves differently.
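If you want the un-wrapped tensors to behave consistently, one option (a sketch, not part of the original answer) is to fetch everything in a single sess.run call, so x and y are sampled exactly once per run and that one draw feeds all downstream ops. The snippet below uses the tf.compat.v1 API so it also runs on TensorFlow 2.x installs; on a plain 1.x install the same code works with `import tensorflow as tf`:

```python
# Sketch: evaluate all dependent ops in ONE sess.run call, so the random
# tensors x and y are sampled only once and shared by every fetched op.
# tf.compat.v1 keeps the graph/session style working on TensorFlow 2.x.
import tensorflow.compat.v1 as tf
tf.disable_eager_execution()

x = tf.random_uniform([], minval=-1, maxval=1)
y = tf.random_uniform([], minval=-1, maxval=1)
addtf = tf.add(x, y)
subtf = tf.subtract(x, y)

with tf.Session() as sess:
    # A single run: one random draw for x and one for y feed all four fetches
    x_value, y_value, add_value, sub_value = sess.run([x, y, addtf, subtf])

print(x_value, y_value, add_value, sub_value)
```

With this pattern, add_value is always x_value + y_value and sub_value is always x_value - y_value, because all four fetches share the same random draw within one run.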

Upvotes: 2
