Reputation: 26532
Could anyone explain what I'm doing wrong, given that my TensorBoard graphs gain additional groups when I use `tf.layers.conv1d`?
For the sake of simplicity I've created one `tf.name_scope` 'conv_block1' that contains: conv1d -> max_pool -> batch_norm, yet my graph has odd additional blocks (see attached screenshot). Basically, a superfluous block 'conv1d' was added holding the weights for the `conv_block1/conv1d` layer, and it is placed in a separate group. This makes networks with multiple convolution blocks completely unreadable. Am I doing something wrong, or is this some kind of bug/performance feature in TensorFlow 1.4? Oddly enough, the dense layers are fine and their weights are properly scoped.
Here is the code if anyone wants to recreate the graph:
def cnn_model(inputs, mode):
    x = tf.placeholder_with_default(inputs['wav'], shape=[None, SAMPLE_RATE, 1], name='input_placeholder')
    with tf.name_scope("conv_block1"):
        x = tf.layers.conv1d(x, filters=80, kernel_size=5, strides=1, padding='same', activation=tf.nn.relu)
        x = tf.layers.max_pooling1d(x, pool_size=3, strides=3)
        x = tf.layers.batch_normalization(x, training=(mode == tf.estimator.ModeKeys.TRAIN))
    x = tf.layers.flatten(x)
    x = tf.layers.dense(x, units=12)
    return x
I've added an even simpler example that can be executed directly to reproduce the issue:
g = tf.Graph()
with g.as_default():
    x = tf.placeholder(name='input', dtype=tf.float32, shape=[None, 16000, 1])
    with tf.name_scope('group1'):
        x = tf.layers.conv1d(x, 80, 5, name='conv1')
    x = tf.layers.dense(x, 10, name="dense1")

[n.name for n in g.as_graph_def().node]
outputs:
['input',
'conv1/kernel/Initializer/random_uniform/shape',
'conv1/kernel/Initializer/random_uniform/min',
'conv1/kernel/Initializer/random_uniform/max',
'conv1/kernel/Initializer/random_uniform/RandomUniform',
'conv1/kernel/Initializer/random_uniform/sub',
'conv1/kernel/Initializer/random_uniform/mul',
'conv1/kernel/Initializer/random_uniform',
'conv1/kernel',
'conv1/kernel/Assign',
'conv1/kernel/read',
'conv1/bias/Initializer/zeros',
'conv1/bias',
'conv1/bias/Assign',
'conv1/bias/read',
'group1/conv1/dilation_rate',
'group1/conv1/conv1d/ExpandDims/dim',
'group1/conv1/conv1d/ExpandDims',
'group1/conv1/conv1d/ExpandDims_1/dim',
'group1/conv1/conv1d/ExpandDims_1',
'group1/conv1/conv1d/Conv2D',
'group1/conv1/conv1d/Squeeze',
'group1/conv1/BiasAdd',
'dense1/kernel/Initializer/random_uniform/shape',
'dense1/kernel/Initializer/random_uniform/min',
'dense1/kernel/Initializer/random_uniform/max',
'dense1/kernel/Initializer/random_uniform/RandomUniform',
'dense1/kernel/Initializer/random_uniform/sub',
'dense1/kernel/Initializer/random_uniform/mul',
'dense1/kernel/Initializer/random_uniform',
'dense1/kernel',
'dense1/kernel/Assign',
'dense1/kernel/read',
'dense1/bias/Initializer/zeros',
'dense1/bias',
'dense1/bias/Assign',
'dense1/bias/read',
'dense1/Tensordot/Shape',
'dense1/Tensordot/Rank',
'dense1/Tensordot/axes',
'dense1/Tensordot/GreaterEqual/y',
'dense1/Tensordot/GreaterEqual',
'dense1/Tensordot/Cast',
'dense1/Tensordot/mul',
'dense1/Tensordot/Less/y',
'dense1/Tensordot/Less',
'dense1/Tensordot/Cast_1',
'dense1/Tensordot/add',
'dense1/Tensordot/mul_1',
'dense1/Tensordot/add_1',
'dense1/Tensordot/range/start',
'dense1/Tensordot/range/delta',
'dense1/Tensordot/range',
'dense1/Tensordot/ListDiff',
'dense1/Tensordot/Gather',
'dense1/Tensordot/Gather_1',
'dense1/Tensordot/Const',
'dense1/Tensordot/Prod',
'dense1/Tensordot/Const_1',
'dense1/Tensordot/Prod_1',
'dense1/Tensordot/concat/axis',
'dense1/Tensordot/concat',
'dense1/Tensordot/concat_1/axis',
'dense1/Tensordot/concat_1',
'dense1/Tensordot/stack',
'dense1/Tensordot/transpose',
'dense1/Tensordot/Reshape',
'dense1/Tensordot/transpose_1/perm',
'dense1/Tensordot/transpose_1',
'dense1/Tensordot/Reshape_1/shape',
'dense1/Tensordot/Reshape_1',
'dense1/Tensordot/MatMul',
'dense1/Tensordot/Const_2',
'dense1/Tensordot/concat_2/axis',
'dense1/Tensordot/concat_2',
'dense1/Tensordot',
'dense1/BiasAdd']
Upvotes: 1
Views: 320
Reputation: 26532
OK, I've found the issue: apparently `tf.name_scope` applies to operations only, while `tf.variable_scope` works for both operations and variables (as per this TF issue).
Here is a Stack Overflow question that explains the difference between `name_scope` and `variable_scope`: What's the difference of name scope and a variable scope in tensorflow?
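The distinction is easy to see with `tf.get_variable` directly. A minimal sketch; I'm importing `tensorflow.compat.v1` as an assumption so the snippet also runs under TF 2.x, but on TF 1.x the plain `tf.` calls behave identically:

```python
import tensorflow.compat.v1 as tf  # assumption: compat alias, same API as plain `tf` on TF 1.x

g = tf.Graph()
with g.as_default():
    with tf.name_scope('ns'):
        # name_scope is ignored when naming variables
        a = tf.get_variable('a', shape=[1])
    with tf.variable_scope('vs'):
        # variable_scope prefixes both variables and ops
        b = tf.get_variable('b', shape=[1])

print(a.name)  # a:0   -- no 'ns/' prefix on the variable
print(b.name)  # vs/b:0
```

This is exactly why the conv1d kernel and bias escaped 'group1' above: they are variables, and `tf.name_scope` never touched them.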
g = tf.Graph()
with g.as_default():
    x = tf.placeholder(name='input', dtype=tf.float32, shape=[None, 16000, 1])
    with tf.variable_scope('v_scope1'):
        x = tf.layers.conv1d(x, 80, 5, name='conv1')

[n.name for n in g.as_graph_def().node]
gives:
['input',
'v_scope1/conv1/kernel/Initializer/random_uniform/shape',
'v_scope1/conv1/kernel/Initializer/random_uniform/min',
'v_scope1/conv1/kernel/Initializer/random_uniform/max',
'v_scope1/conv1/kernel/Initializer/random_uniform/RandomUniform',
'v_scope1/conv1/kernel/Initializer/random_uniform/sub',
'v_scope1/conv1/kernel/Initializer/random_uniform/mul',
'v_scope1/conv1/kernel/Initializer/random_uniform',
'v_scope1/conv1/kernel',
'v_scope1/conv1/kernel/Assign',
'v_scope1/conv1/kernel/read',
'v_scope1/conv1/bias/Initializer/zeros',
'v_scope1/conv1/bias',
'v_scope1/conv1/bias/Assign',
'v_scope1/conv1/bias/read',
'v_scope1/conv1/dilation_rate',
'v_scope1/conv1/conv1d/ExpandDims/dim',
'v_scope1/conv1/conv1d/ExpandDims',
'v_scope1/conv1/conv1d/ExpandDims_1/dim',
'v_scope1/conv1/conv1d/ExpandDims_1',
'v_scope1/conv1/conv1d/Conv2D',
'v_scope1/conv1/conv1d/Squeeze',
'v_scope1/conv1/BiasAdd']
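Applying the same change to the original conv_block1 pattern keeps every node, variables included, under one block. A sketch of the fix under my setup (the `compat.v1` alias is an assumption for running under TF 2.x, and `training=True` is hard-coded here for brevity):

```python
import tensorflow.compat.v1 as tf  # assumption: compat alias, same API as plain `tf` on TF 1.x

g = tf.Graph()
with g.as_default():
    x = tf.placeholder(tf.float32, [None, 16000, 1], name='input')
    with tf.variable_scope('conv_block1'):  # was tf.name_scope
        x = tf.layers.conv1d(x, filters=80, kernel_size=5, strides=1,
                             padding='same', activation=tf.nn.relu)
        x = tf.layers.max_pooling1d(x, pool_size=3, strides=3)
        x = tf.layers.batch_normalization(x, training=True)

# every node is either the input placeholder or lives under conv_block1/,
# so TensorBoard draws the whole block as a single collapsible group
ok = all(n.name.startswith(('input', 'conv_block1/'))
         for n in g.as_graph_def().node)
print(ok)
```

With multiple blocks, wrapping each in its own `tf.variable_scope('conv_blockN')` keeps the graph readable.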
Upvotes: 1