Reputation: 3649
In my code, I am changing my current default graph for some reason and rebuilding all the computation logic from scratch. This naturally leads to some errors, since my tf.placeholder items stay in the old graph. I could of course declare them again, but to do that I would have to write a lot of glue code that would needlessly complicate everything. What I need is to get all tf.placeholder objects in my current graph and then transfer them into the new graph I am going to create. Is there any way to do that? My preliminary research did not give any meaningful results, but I am highly positive there should be a way to do this in TensorFlow.
Upvotes: 0
Views: 96
Reputation: 27042
If you have 2 graphs, you can copy operations and variables from one graph to the other using the tf.contrib.copy_graph module.
In particular, you can use copy_op_to_graph to copy the placeholder pl from graph g1 to graph g2:
# Copies the placeholder `pl` into `g2`; the third argument is the list of
# copied variables (none here) and `scope` optionally prefixes the new name.
copied_pl = tf.contrib.copy_graph.copy_op_to_graph(pl, g2, [], scope='')
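For reference, here is a minimal end-to-end sketch, assuming TensorFlow 1.x where tf.contrib.copy_graph is available; the graph names, the placeholder shape, and the name 'my_input' are only illustrative:

import tensorflow as tf

# Original graph containing the placeholder we want to carry over.
g1 = tf.Graph()
with g1.as_default():
    pl = tf.placeholder(tf.float32, shape=[None, 3], name='my_input')

# Fresh graph that will receive a copy of the placeholder.
g2 = tf.Graph()

# No variables are being copied along, hence the empty list.
copied_pl = tf.contrib.copy_graph.copy_op_to_graph(pl, g2, [], scope='')

# The copy now lives in g2 under the same name and can be looked up there.
with g2.as_default():
    print(g2.get_tensor_by_name('my_input:0'))

If you first need to collect every placeholder in the old graph, you can filter g1.get_operations() for operations whose type is 'Placeholder' and take their outputs, then pass each one through copy_op_to_graph as above.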
Upvotes: 3