Reputation: 1960
I have a question about random variables in TensorFlow. Suppose I need a random variable inside my loss function. In the TensorFlow tutorials I find random functions used to initialize variables, such as weights, which are later modified by the training process. In my case I need a random vector of floats (say, 128 values) that follows a particular distribution (uniform or Gaussian) but that can change at each loss calculation.
Is defining this variable in my loss function enough, so that at each epoch I get new values (still drawn from the selected distribution), or will the values be the same across all iterations?
Upvotes: 0
Views: 1903
Reputation: 27042
If you assign the randomly generated values to a Variable, they will remain fixed until you update that variable.
If, instead, you put the generation op (tf.random_*) directly in the loss function, the numbers will be different at each call.
Just try this out:
import tensorflow as tf

# generator
x = tf.random_uniform((3, 1), minval=0, maxval=10)
# variable
a = tf.get_variable("a", shape=(3, 1), dtype=tf.float32)
# assignment
b = tf.assign(a, x)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    for i in range(5):
        # 5 different values
        print(sess.run(x))
    # assign the value
    sess.run(b)
    for i in range(5):
        # 5 equal values
        print(sess.run(a))
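The same distinction can be sketched framework-free with NumPy (a minimal analogy, not TensorFlow code: `loss_with_fresh_noise` and `loss_with_fixed_noise` are hypothetical names chosen for illustration). Drawing the noise inside the function corresponds to putting tf.random_* in the loss graph; drawing it once up front corresponds to assigning it to a Variable.

```python
import numpy as np

rng = np.random.default_rng()

def loss_with_fresh_noise(pred):
    # a new 128-value Gaussian vector is drawn on every call,
    # like placing a tf.random_* op directly in the loss
    noise = rng.normal(size=128)
    return float(np.mean((pred - noise) ** 2))

# drawn once and stored, like assigning the sample to a Variable
fixed_noise = rng.normal(size=128)

def loss_with_fixed_noise(pred):
    return float(np.mean((pred - fixed_noise) ** 2))

pred = np.zeros(128)
# fresh noise: two calls give different loss values
print(loss_with_fresh_noise(pred), loss_with_fresh_noise(pred))
# fixed noise: two calls give the same loss value
print(loss_with_fixed_noise(pred), loss_with_fixed_noise(pred))
```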
Upvotes: 1
Reputation: 24581
A random node in TensorFlow takes a new value each time it is run, as you can verify by evaluating it several times:
import tensorflow as tf
x = tf.random_uniform(shape=())
sess = tf.Session()
sess.run(x)
# 0.79877698
sess.run(x)
# 0.76016617
It is not a Variable in TensorFlow terminology, as you can check from the code above, which runs without calling the variable initializer.
Upvotes: 3