Reputation: 335
I wrote a function using TensorFlow ops. I know that when I run the function, it adds many ops to the graph, but I am confused about how to get access to those ops.
For example:
import tensorflow as tf

def assign_weights():
    with tf.name_scope('zheng'):
        v = tf.Variable(0, name='v', dtype=tf.float32)
        b = tf.placeholder(tf.float32, shape=())
        z = tf.assign(v, b)
    return z, b
I can use feed_dict to pass a value to b, but only if I make b a return value; otherwise there is no way to access b. If we want to access many ops in the function's scope, we have to return many values, which is very ugly.
I want to know what happens under the hood when I run such functions with TensorFlow, and how to get access to the ops in the function's scope.
Thank you!
Upvotes: 1
Views: 95
Reputation: 419
Three things happen when you call an op function, for example a = tf.add(b, c, name='add'): a node of type Add is created, it is added to the default graph, and it is given the name 'add'. So you can access the nodes via sess.graph; there are many functions for accessing nodes, e.g. get_operation_by_name.
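For example, a minimal sketch of that lookup, assuming the assign_weights function from the question and TensorFlow's default auto-naming for the unnamed placeholder and assign ops:

import tensorflow as tf

z, _ = assign_weights()  # the reference to b is deliberately discarded

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    # 'zheng/Placeholder:0' and 'zheng/Assign' assume default auto-naming;
    # if unsure, list the real names via [n.name for n in sess.graph_def.node].
    b = sess.graph.get_tensor_by_name('zheng/Placeholder:0')
    assign_op = sess.graph.get_operation_by_name('zheng/Assign')
    print(assign_op.name)                   # zheng/Assign
    print(sess.run(z, feed_dict={b: 3.0}))  # 3.0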
Also, you can operate the graph via sess.graph_def
, which is serialized graph with protobuf, you can find the protobuf definition in the tensorflow source code, tensorflow/core/framework
, some .proto files there.
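For instance, a rough sketch of walking the GraphDef to see which nodes the function added (again assuming assign_weights from the question):

import tensorflow as tf

z, b = assign_weights()  # populate the default graph

with tf.Session() as sess:
    # sess.graph_def is a GraphDef protobuf; each NodeDef carries the
    # node's name, its op type, and its inputs.
    for node in sess.graph_def.node:
        print(node.name, node.op)  # e.g. zheng/Placeholder Placeholder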
Upvotes: 0
Reputation: 1199
Obviously, it's true that to access an op (or tensor) you need some reference to it. IMHO, one standard workaround is to build your graph in a class, make the relevant tensors attributes of the class, and access them through the object.
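A minimal sketch of that idea with the graph from the question (the class name is just illustrative):

import tensorflow as tf

class AssignWeights:
    def __init__(self):
        # Every tensor we care about becomes an attribute, so nothing
        # has to be threaded through return values.
        with tf.name_scope('zheng'):
            self.v = tf.Variable(0, name='v', dtype=tf.float32)
            self.b = tf.placeholder(tf.float32, shape=())
            self.z = tf.assign(self.v, self.b)

model = AssignWeights()
with tf.Session() as sess:
    print(sess.run(model.z, feed_dict={model.b: 3.0}))  # 3.0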
Alternatively, if you're more inclined to the functional approach, a better way than returning all relevant ops and tensors separately would be to return a dict (or namedtuple).
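For instance, a sketch of the namedtuple variant:

import collections
import tensorflow as tf

AssignOps = collections.namedtuple('AssignOps', ['z', 'b', 'v'])

def assign_weights():
    with tf.name_scope('zheng'):
        v = tf.Variable(0, name='v', dtype=tf.float32)
        b = tf.placeholder(tf.float32, shape=())
        z = tf.assign(v, b)
    # One named bundle instead of a growing list of return values.
    return AssignOps(z=z, b=b, v=v)

ops = assign_weights()
with tf.Session() as sess:
    print(sess.run(ops.z, feed_dict={ops.b: 3.0}))  # 3.0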
Additionally, there are also specialized functions that return ops by name, e.g. get_operation_by_name.
As an aside to this question, you might also want to try out eager execution, which is imperative.
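A tiny sketch, assuming TF 1.x where tf.enable_eager_execution() is available (in TF 2.x eager is the default):

import tensorflow as tf

tf.enable_eager_execution()

# Ops execute immediately; you keep ordinary Python references, so
# there is no graph (or placeholder) to dig through afterwards.
v = tf.Variable(0.0, name='v')
v.assign(3.0)
print(v.numpy())  # 3.0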
Upvotes: 1