Reputation: 301
Produced below is an excerpt from my code:
and I get this error even though I also tried tf.initialize_all_variables().
Why are the variables inside the function linear_layer not initialized?
Upvotes: 0
Views: 70
Reputation: 4495
You should explicitly build the graph (for example, by creating out) before running the initializer, so TensorFlow knows which graph elements to evaluate. In your original code, the graph had not been built yet when you called tf.global_variables_initializer(); the initializer op only covers variables that already exist in the graph at that point, which is why W is not initialized.
import tensorflow as tf  # TensorFlow 1.x

# `glorot` was not defined in the snippet; assuming a Glorot (Xavier) initializer.
glorot = tf.glorot_uniform_initializer()

def linear_layer(input, units):
    W = tf.Variable(initial_value=glorot(shape=(input.get_shape().as_list()[1], units)), name="W")
    B = tf.Variable(initial_value=tf.zeros(shape=(input.get_shape().as_list()[0], 1)), name="B")
    out = tf.matmul(input, W) + B
    return out

# Build the graph first, then create and run the initializer.
out = linear_layer(input=tf.constant([[1., 2., 3.], [4., 5., 6.]]), units=10)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    sess.run(tf.local_variables_initializer())
    print(sess.run(out))
# [[ 0.8285629 0.7860288 1.8736962 0.4321289 -0.9692887 -1.638855
# -0.19338632 0.5580156 -0.13394058 1.6745124 ]
# [ 1.9110355 1.2211521 3.2454844 -0.9029484 -2.0184612 -2.753471
# -0.29346204 0.340119 0.04118478 2.893313 ]]
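The ordering rule can be illustrated without TensorFlow at all. The sketch below is a plain-Python analogy (hypothetical `Graph` class, not a real TF API): an initializer snapshots the variables that exist at the moment it is created, so one created before the layer is built misses W and B.

```python
class Graph:
    """Toy stand-in for a TF1 graph: just tracks variable names."""

    def __init__(self):
        self.variables = []

    def variable(self, name):
        self.variables.append(name)
        return name

    def initializer(self):
        # Snapshot the variables known *right now*, like
        # tf.global_variables_initializer() does when the op is created.
        covered = list(self.variables)
        return lambda: set(covered)


g = Graph()

# Wrong order: initializer created before the layer adds its variables.
init_early = g.initializer()
g.variable("W")
g.variable("B")

# Right order: initializer created after the graph is fully built.
init_late = g.initializer()

print(sorted(init_early()))  # [] -- W and B are missed
print(sorted(init_late()))   # ['B', 'W']
```

The same logic applies to the answer above: call linear_layer first, then create and run the initializer inside the session.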
Upvotes: 1