Reputation: 2985
I might be wrong, but my understanding is that as long as I'm using the same session, it doesn't matter how many times I build the graph: the variables will be the same, so if I train them in one graph, then the next time I build the graph within the same session, the variables will already be trained. Am I wrong? Do I need to use scopes, or do I need to manually copy the variables between graphs?
Upvotes: 2
Views: 173
Reputation: 15119
tf.make_template() may be what you are looking for. It wraps a function that builds a set of tensor operations so that the variables created inside it are shared across every call, rather than being recreated each time.
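For example, here is a minimal sketch assuming TensorFlow 1.x (in TF 2.x the same API lives under tf.compat.v1); the function name `linear` and the variable shapes are just illustrative:

```python
import tensorflow as tf  # assumes TensorFlow 1.x

def linear(x):
    # Variables created with tf.get_variable inside a template
    # are created on the first call and reused on later calls.
    w = tf.get_variable("w", shape=[2, 2], initializer=tf.zeros_initializer())
    b = tf.get_variable("b", shape=[2], initializer=tf.zeros_initializer())
    return tf.matmul(x, w) + b

# Wrap the op-building function in a template.
linear_template = tf.make_template("linear", linear)

x1 = tf.placeholder(tf.float32, [None, 2])
x2 = tf.placeholder(tf.float32, [None, 2])

y1 = linear_template(x1)  # creates linear/w and linear/b
y2 = linear_template(x2)  # reuses the same linear/w and linear/b
```

Both y1 and y2 are computed with the same underlying variables, so training through one output also updates the other. Note this shares variables between calls within one graph; variables are still tied to the graph they were created in, not to the session.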
Upvotes: 1