Reputation: 805
I have a question about tensors that are not shown. I wrote the code below and expected to see tensors named 'ttt', 'l1a', and 'l1_maxpool', but they do not appear when I list them with the tf.all_variables function. However, when I set a breakpoint at that position and inspect them, they do exist.
Is there a reason they are not shown, or is there something I should modify in the code? Thanks in advance.
import tensorflow as tf

def init_weights(shape, name):
    return tf.Variable(tf.random_normal(shape, stddev=0.01), name=name)

X = tf.placeholder("float", [None, 28, 28, 1])
Y = tf.placeholder("float", [None, 10])

w = init_weights([3, 3, 1, 32], 'w')
w2 = init_weights([3, 3, 32, 64], 'w2')
w3 = init_weights([3, 3, 64, 128], 'w3')
w4 = init_weights([128 * 4 * 4, 625], 'w4')
w_o = init_weights([625, 10], 'w_o')

ttt = tf.nn.conv2d(X, w, strides=[1, 1, 1, 1], padding='SAME', name='ttt')
l1a = tf.nn.relu(ttt, name='l1a')
l1 = tf.nn.max_pool(l1a, ksize=[1, 2, 2, 1], strides=[1, 2, 2, 1],
                    padding='SAME', name='l1_maxpool')

with tf.Session() as sess:
    tf.initialize_all_variables().run()
    # Both lookups succeed, so the tensors do exist in the graph.
    z_ttt = tf.get_default_graph().get_tensor_by_name(ttt.name)
    z_l1 = tf.get_default_graph().get_tensor_by_name(l1.name)
    # But only the tf.Variable objects (w, w2, ...) are listed here.
    tensors = tf.all_variables()
    for k in range(len(tensors)):
        print(tensors[k].name)
    kk = 0  # breakpoint set here
Upvotes: 0
Views: 31
Reputation: 4451
There is a difference between variables and operations. Convolution, relu, and max_pool are operations you perform on variables, so tf.all_variables() will not list them: it returns only the tf.Variable objects (w, w2, and so on). The operations still live in the graph, which is why get_tensor_by_name succeeds. If you visualise your graph with TensorBoard (https://www.tensorflow.org/how_tos/graph_viz/) you can see the operations you added are indeed in there!
Hope this helps!
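To make the distinction concrete, here is a minimal sketch. It uses the TF 1.x-style `tf.compat.v1` shim (the names in the question, such as `tf.all_variables`, come from an older release; in 1.x the same collection is exposed as `tf.global_variables`). The variables collection contains only `tf.Variable` objects, while ops like the conv2d show up in `graph.get_operations()`:

```python
import tensorflow.compat.v1 as tf

tf.disable_eager_execution()  # build a graph, TF 1.x style

x = tf.placeholder(tf.float32, [None, 28, 28, 1])
w = tf.Variable(tf.random_normal([3, 3, 1, 32], stddev=0.01), name='w')
conv = tf.nn.conv2d(x, w, strides=[1, 1, 1, 1], padding='SAME', name='ttt')

# Only tf.Variable objects live in the variables collection.
var_names = [v.name for v in tf.global_variables()]
print(var_names)  # ['w:0']

# The conv2d op is registered in the graph, not the variables collection.
op_names = [op.name for op in tf.get_default_graph().get_operations()]
print('ttt' in op_names)  # True

# Its output tensor can still be fetched by name.
t = tf.get_default_graph().get_tensor_by_name('ttt:0')
print(t is conv)  # True
```

So the code in the question is behaving as designed; to enumerate the ops named 'ttt', 'l1a', and 'l1_maxpool', iterate over `graph.get_operations()` instead of the variables list.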
Upvotes: 1