Vibhor Kanojia

Reputation: 417

Why do we use tf.name_scope()

I've been reading the tutorials on TensorFlow where they have written

with tf.name_scope('read_inputs') as scope:
    # something

The examples

a = tf.constant(5)

and

with tf.name_scope('s1') as scope:
    a = tf.constant(5)

seem to have the same effect. So, why do we use name_scope?

Upvotes: 32

Views: 17925

Answers (3)

Soph

Reputation: 973

I don't see a use case for reusing constants, but here is some relevant information on scopes and variable sharing.

Scopes

  • name_scope will add its scope as a prefix to the names of all operations

  • variable_scope will add its scope as a prefix to the names of all variables and operations (see the sketch below)
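
For example, a minimal sketch of the prefixing (TF1-style API, as in the rest of this answer; the scope and op names here are made up for illustration):

import tensorflow as tf

with tf.name_scope("ns"):
    a = tf.add(1, 2, name="add")

with tf.variable_scope("vs"):
    b = tf.add(1, 2, name="add")

print(a.op.name)  # ns/add
print(b.op.name)  # vs/add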

Instantiating Variables

  • the tf.Variable() constructor prefixes the variable name with the current name_scope and variable_scope

  • the tf.get_variable() constructor ignores name_scope and only prefixes the name with the current variable_scope

For example:

with tf.variable_scope("variable_scope"):
    with tf.name_scope("name_scope"):
        var1 = tf.get_variable("var1", [1])

with tf.variable_scope("variable_scope"):
    with tf.name_scope("name_scope"):
        var2 = tf.Variable([1], name="var2")

Produces

var1 = <tf.Variable 'variable_scope/var1:0' shape=(1,) dtype=float32_ref>

var2 = <tf.Variable 'variable_scope/name_scope/var2:0' shape=(1,) dtype=int32_ref>

Reusing Variables

  • Always use tf.variable_scope to define the scope of a shared variable

  • The easiest way to reuse variables is to call reuse_variables(), as shown below

with tf.variable_scope("scope"):
    var1 = tf.get_variable("variable1", [1])
    tf.get_variable_scope().reuse_variables()  # mark the current scope as reusing
    var2 = tf.get_variable("variable1", [1])   # returns the existing variable
assert var1 is var2

  • tf.Variable() always creates a new variable; when a variable is constructed with an already-used name, it just appends _1, _2, etc. to the name - which can cause conflicts :( (see the sketch below)
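
A quick sketch of that renaming (the variable names are made up for illustration):

import tensorflow as tf

v1 = tf.Variable([1.0], name="v")
v2 = tf.Variable([1.0], name="v")  # same requested name: TF uniquifies it

print(v1.name)  # v:0
print(v2.name)  # v_1:0 - a brand-new variable, not a reuse of v1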

Upvotes: 11

Bs He

Reputation: 747

I will try to explain this in loose but easy-to-understand language.

name scope

usually used to group the variables involved in an op. That is, it tells you which variables are included in this op, but it does not track their existence. You just know: OK, to complete this op, I need to prepare this, this and this variable. In practice, when using TensorBoard, it helps you bind related nodes together so your plot won't be messy (see the sketch below).
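
A minimal sketch of that grouping (TF1 API; the layer name and log directory are made up):

import tensorflow as tf

with tf.name_scope("layer1"):
    w = tf.Variable(tf.zeros([784, 100]), name="weights")
    b = tf.Variable(tf.zeros([100]), name="biases")

# In TensorBoard's graph view, everything under "layer1" collapses
# into a single expandable node.
writer = tf.summary.FileWriter("/tmp/logdir", tf.get_default_graph())
writer.close()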

variable scope

think of this as a drawer. Compared with a name scope, it has a more "physical" meaning, because such a drawer truly exists; by contrast, a name scope just helps you understand which variables are involved.

Since a variable scope "physically" exists, it enforces that a variable which is already there cannot be defined again; if you want to use it multiple times, you need to indicate reuse (see the sketch below).
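
A small sketch of that constraint (the scope and variable names are made up):

import tensorflow as tf

with tf.variable_scope("drawer"):
    v = tf.get_variable("v", [1])

# Reopening the scope without reuse would raise a ValueError;
# with reuse=True you get the same variable back.
with tf.variable_scope("drawer", reuse=True):
    v_again = tf.get_variable("v", [1])

assert v is v_again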

Upvotes: 0

etarion

Reputation: 17159

They are not the same thing.

import tensorflow as tf
c1 = tf.constant(42)
with tf.name_scope('s1'):
    c2 = tf.constant(42)
print(c1.name)
print(c2.name)

prints

Const:0
s1/Const:0

So, as the name suggests, the scope functions create a scope for the names of the ops you create inside them. This affects how you refer to tensors, how reuse works, how the graph shows up in TensorBoard, and so on.
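
For example, the scoped name is what you would use to look the tensor up again later (a minimal sketch building on the snippet above):

import tensorflow as tf

c1 = tf.constant(42)
with tf.name_scope('s1'):
    c2 = tf.constant(42)

g = tf.get_default_graph()
t = g.get_tensor_by_name('s1/Const:0')  # the scope is part of the name
assert t is c2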

Upvotes: 27
