Reputation: 55
So for various reasons (such as its language independence) I want to use TensorFlow's saved_model API for saving/loading models. I can save everything (and restore it successfully) with a call to builder.add_meta_graph_and_variables()
at the end of training, but I don't see any way to save periodically. The TensorFlow docs on this are sparse, and the template code they provide (here) doesn't help me:
...
builder = tf.saved_model.builder.SavedModelBuilder(export_dir)

with tf.Session(graph=tf.Graph()) as sess:
    ...
    builder.add_meta_graph_and_variables(sess,
                                         ["foo-tag"],
                                         signature_def_map=foo_signatures,
                                         assets_collection=foo_assets)
...
with tf.Session(graph=tf.Graph()) as sess:
    ...
    builder.add_meta_graph(["bar-tag", "baz-tag"])
...
builder.save()
Calling builder.save() does not save the new variables into the model; it just updates the model protobuf. What am I missing? How do I save after e.g. the nth epoch using saved_model?
Upvotes: 2
Views: 456
Reputation: 41
The variables are saved when you call builder.add_meta_graph_and_variables(), because saver.save() is called inside it. See here.
Solution: just call saver.save(sess, export_dir + '/variables/variables', write_meta_graph=False, write_state=False) before builder.save().
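For context, here is a minimal, self-contained sketch of that approach in TensorFlow 1.x. The toy variable, export path, and epoch count are stand-ins for a real model and training loop, not part of the original answer:

import tensorflow as tf

export_dir = "/tmp/periodic_saved_model"  # hypothetical export path

builder = tf.saved_model.builder.SavedModelBuilder(export_dir)

with tf.Session(graph=tf.Graph()) as sess:
    # Toy graph: a single trainable variable bumped once per "epoch".
    step = tf.get_variable("step", shape=[], dtype=tf.float32,
                           initializer=tf.zeros_initializer())
    increment = tf.assign_add(step, 1.0)
    sess.run(tf.global_variables_initializer())

    # Write the MetaGraphDef and the initial variable values once.
    builder.add_meta_graph_and_variables(
        sess, [tf.saved_model.tag_constants.TRAINING])

    saver = tf.train.Saver()
    for epoch in range(5):
        sess.run(increment)  # stand-in for one epoch of training

        # Overwrite the variables shard inside the SavedModel directory.
        # write_meta_graph=False / write_state=False keep the SavedModel
        # layout intact (no stray .meta or checkpoint state files).
        saver.save(sess,
                   export_dir + "/variables/variables",
                   write_meta_graph=False,
                   write_state=False)

# Write saved_model.pb; the variables on disk are whatever was last saved.
builder.save()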
Upvotes: 0
Reputation: 55
Well, after looking through the TensorFlow code here and elsewhere, it looks like the answer is "you can't". SavedModelBuilder
is really designed for models outside of the training phase: it lets you add metagraphs and choose which sets of variables to load/save (i.e. TRAINING vs. SERVING), but that's it. SavedModelBuilder.add_meta_graph_and_variables
, for example, can be called exactly once, and there is no SavedModelBuilder.update_variables
or anything like that. During training, on the other hand, you need to use the Saver
class and save checkpoints and their associated files. Why there isn't a unified system for this I have no idea, but apparently that's the way it is.
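To make that split concrete, here is a minimal sketch (TensorFlow 1.x) of the conventional two-phase workflow: ordinary Saver checkpoints during training, then a single SavedModel export at the end. The toy variable and directory paths are placeholders, not anything from the original post:

import tensorflow as tf

ckpt_dir = "/tmp/train_ckpts"      # hypothetical checkpoint directory
export_dir = "/tmp/final_export"   # hypothetical SavedModel directory

tf.gfile.MakeDirs(ckpt_dir)  # Saver.save needs the parent directory to exist

with tf.Session(graph=tf.Graph()) as sess:
    # Toy model: one variable bumped once per "epoch".
    step = tf.get_variable("step", shape=[], dtype=tf.float32,
                           initializer=tf.zeros_initializer())
    train_op = tf.assign_add(step, 1.0)
    sess.run(tf.global_variables_initializer())

    # Phase 1: periodic saving during training via plain checkpoints.
    saver = tf.train.Saver(max_to_keep=3)
    for epoch in range(5):
        sess.run(train_op)  # stand-in for one epoch of training
        saver.save(sess, ckpt_dir + "/model", global_step=epoch)

    # Phase 2: one-shot export for serving / other languages once training is done.
    builder = tf.saved_model.builder.SavedModelBuilder(export_dir)
    builder.add_meta_graph_and_variables(
        sess, [tf.saved_model.tag_constants.SERVING])
    builder.save()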
Upvotes: 1