m_squared

Reputation: 225

Are multiple instantiations of a graph created when two TensorFlow sessions are created?

Background

I'm learning TensorFlow by walking through part II of Hands-On Machine Learning with Scikit-Learn and TensorFlow, and one of the exercise questions is (the bolded text is my own addition, to add clarity to the question):

"If you create a graph g containing a variable w, then start two threads and open a session in each thread, both using the same graph g, will each session have its own copy of the variable w or will it be shared?"

The answer provided in the back of the book is below:

"In local TensorFlow, session manage variable values, so if you create a graph g containing variable w, then start two threads and open a local session in each thread, both using the same graph g, then each session will have its own copy of the variable w..."

My Question

Is A) or B) the correct interpretation of what is happening?

A) Both sessions use the same instance of the graph g, and the two separate copies of the variable are due only to the two separate sessions.

B) With the instantiation of two distinct sessions, the two threads use the same architecture of the graph g, but two separate instantiations of graph g are created, leading to two distinct variables.
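For concreteness, here is a minimal sketch of the two-thread scenario from the exercise, assuming TensorFlow 1.x (the API used in the book); the worker function and the [2, 3] shape are my own illustration:

import threading
import tensorflow as tf

g = tf.Graph()
with g.as_default():
    w = tf.Variable(tf.random_uniform([2, 3]), name="w")
    init = tf.global_variables_initializer()

def worker(results, i):
    # Each thread opens its own session on the *same* graph g.
    with tf.Session(graph=g) as sess:
        sess.run(init)          # runs the initializer in this session only
        results[i] = sess.run(w)

results = [None, None]
threads = [threading.Thread(target=worker, args=(results, i)) for i in range(2)]
for t in threads:
    t.start()
for t in threads:
    t.join()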

Upvotes: 2

Views: 441

Answers (2)

javidcf

Reputation: 59691

I think the problem here is the ambiguity of the term "instantiation", and of what the graph and the session actually are.

If you have one graph and two open sessions for it, there is just one instance of the graph. This is a Python object that describes the computation performed by your model, namely, operations between tensors, possibly including some constant values and variables. If you add a new element to that graph (a new operation), it will be accessible by both sessions. It is important to understand that graphs are static: they have no state, and they do not compute anything by themselves; they just describe how the computation will be performed. The graph can be thought of as an analog to the source code of a computer program.
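As a quick check of this, the following sketch (assuming TensorFlow 1.x; sess1, sess2, a and b are illustrative names of my own) shows one graph shared by two sessions, with a newly added op visible to both:

import tensorflow as tf

g = tf.Graph()
with g.as_default():
    a = tf.constant(1)

sess1 = tf.Session(graph=g)
sess2 = tf.Session(graph=g)

# Both sessions refer to the very same Python graph object.
print(sess1.graph is sess2.graph)  # True

# An operation added to g after the sessions were created...
with g.as_default():
    b = a + 1

# ...is visible to, and runnable by, both sessions.
print(sess1.run(b))  # 2
print(sess2.run(b))  # 2

sess1.close()
sess2.close()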

The session is an object that stores state for the graph and can execute computations on it. This "state" contains, most importantly, the values of the variables in the graph. So, the variable object itself is part of the graph (and, in that sense, "shared" across sessions), but the value it has at any time is stored in each open session. Variable values are not the only thing stored within the session, though; you also have things like the state of random number generators or dataset iterators. See What is a "stateful object" in tensorflow?. Following the earlier analogy, the session would be something like the memory and CPU used by the execution of a program for which the graph is the source code.
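Here is a minimal sketch of per-session variable values, again assuming TensorFlow 1.x; the counter variable and the assign_add op are my own example:

import tensorflow as tf

g = tf.Graph()
with g.as_default():
    w = tf.Variable(0, name="w")
    inc = tf.assign_add(w, 1)
    init = tf.global_variables_initializer()

sess1 = tf.Session(graph=g)
sess2 = tf.Session(graph=g)
sess1.run(init)
sess2.run(init)

sess1.run(inc)       # updates w's value in sess1's state only
print(sess1.run(w))  # 1
print(sess2.run(w))  # 0 -- sess2 keeps its own value for the same variable object

sess1.close()
sess2.close()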

To answer your question more specifically: the correct interpretation would be A), I think, if I am understanding what you mean correctly.

Upvotes: 2

gorjan

Reputation: 5555

Let's check what happens:

import numpy as np
import tensorflow as tf

tf.reset_default_graph()
w = tf.Variable(tf.random_uniform([2, 3]), name="w")

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    w1 = sess.run(w)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    w2 = sess.run(w)

# np.testing.assert_equal raises AssertionError itself when the arrays differ,
# so it must not be wrapped in a bare assert (it returns None).
np.testing.assert_equal(w1, w2)

And we get an assertion error, which means that B) is the correct answer.

Upvotes: 0
