Reputation: 767
import tensorflow as tf
import numpy as np
x_tf = tf.placeholder('float',[None, 2, 5, 1])
x_np = np.random.normal(0, 1, [1, 2, 5, 1])
# ======== filter: option 1 vs option 2 (use one at a time) =========
f_np = np.random.normal(0, 1, [1, 3, 1, 1])
f_tf = tf.constant(f_np, 'float')      # option 1: values fixed in the graph
f_tf = tf.random_normal([1, 3, 1, 1])  # option 2: random op (overrides option 1)
# ===============================================
x_conv = tf.nn.conv2d(x_tf,f_tf,[1,1,1,1],'SAME')
with tf.Session() as sess:
    tf.global_variables_initializer().run()
    x_conv_np = sess.run(x_conv, feed_dict={x_tf: x_np})
    x_conv_np2 = sess.run(x_conv, feed_dict={x_tf: x_np})
If I run the code above with option 1, I get the same values for x_conv_np and x_conv_np2. However, when I run it with option 2, I get different values for x_conv_np and x_conv_np2.

I am guessing that tf.random_normal gets re-sampled every time the session is run. Is this meant to happen? It happens even if I call tf.set_random_seed. Can someone explain how TensorFlow initializes its random values when the session is run?
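For reference, a minimal sketch of the seed behaviour I am describing (assuming TF 1.x): the two sess.run calls still return different values, and the seed only makes the sequence reproducible across program runs:

import tensorflow as tf

tf.set_random_seed(42)     # graph-level seed
r = tf.random_normal([2])

with tf.Session() as sess:
    print(sess.run(r))     # one sample
    print(sess.run(r))     # a different sample, despite the seed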
Upvotes: 2
Views: 1229
Reputation: 126154
All of the random number ops in TensorFlow (including tf.random_normal()) sample a new random tensor each time they run:
TensorFlow has several ops that create random tensors with different distributions. The random ops are stateful, and create new random values each time they are evaluated.
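To see the contrast with your option 1, here is a minimal sketch (assuming TF 1.x): a tf.constant bakes its values into the graph once, while a random op draws fresh values on every evaluation:

import tensorflow as tf
import numpy as np

c = tf.constant(np.random.normal(0, 1, [2]), tf.float32)  # values fixed at graph construction
r = tf.random_normal([2])                                 # re-sampled on every evaluation

with tf.Session() as sess:
    print(sess.run(c), sess.run(c))  # identical values both times
    print(sess.run(r), sess.run(r))  # different values each time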
If you want to sample the distribution once and then re-use the result, you should use a tf.Variable and initialize it by running tf.random_normal() once. For example, the following code will print the same random value twice:
f_tf = tf.Variable(tf.random_normal([1, 3, 1, 1]))
# ...
init_op = tf.global_variables_initializer()
# ...
with tf.Session() as sess:
    sess.run(init_op)
    print(sess.run(f_tf))
    print(sess.run(f_tf))
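Applied to your code, a sketch of option 2 rewritten with a variable (assuming the x_tf, x_np, and imports from your snippet): the filter is sampled once at initialization, so both conv results match:

f_tf = tf.Variable(tf.random_normal([1, 3, 1, 1]))         # sampled once, at initialization
x_conv = tf.nn.conv2d(x_tf, f_tf, [1, 1, 1, 1], 'SAME')

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())            # filter values drawn here
    x_conv_np = sess.run(x_conv, feed_dict={x_tf: x_np})
    x_conv_np2 = sess.run(x_conv, feed_dict={x_tf: x_np})  # same filter -> same result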
Upvotes: 3