EffePi

Reputation: 376

Deploy a model on ml-engine, exporting with tf.train.Saver()

I want to deploy a model on the new version of Google ML Engine. Previously, with Google ML, I could export my trained model by creating a tf.train.Saver() and saving the model with saver.save(session, output).

So far I've not been able to find out whether a model exported this way is still deployable on ml-engine, or whether I must follow the training procedure described here, create a new trainer package, and necessarily train my model with ml-engine.

Can I still use tf.train.Saver() to obtain the model I will deploy on ml-engine?

Upvotes: 4

Views: 1080

Answers (1)

Nikhil Kothari

Reputation: 5225

tf.train.Saver() only produces a checkpoint.

Cloud ML Engine uses a SavedModel, produced from these APIs: https://www.tensorflow.org/versions/master/api_docs/python/tf/saved_model?hl=bn

A SavedModel is:

  - a checkpoint (the variable values), plus
  - a serialized protobuf containing one or more graph definitions, plus
  - a set of signatures declaring the inputs and outputs of the graph/model, plus
  - additional asset files, if applicable,

so that all of these can be used at serving time.
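To make the difference concrete, here is a minimal sketch (not from the question; the graph, tensor names, and paths are all hypothetical, and it uses the `tf.compat.v1` namespace so it runs on current TensorFlow). `tf.train.Saver` writes only the variables; the `SavedModelBuilder` additionally writes the graph and a signature naming the inputs and outputs:

```python
import os
import tempfile
import tensorflow.compat.v1 as tf

tf.disable_eager_execution()

# Hypothetical toy graph standing in for a trained model.
x = tf.placeholder(tf.float32, shape=[None, 1], name="x")
w = tf.Variable([[2.0]], name="w")
y = tf.matmul(x, w, name="y")

export_dir = os.path.join(tempfile.mkdtemp(), "1")

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())

    # tf.train.Saver() alone: a checkpoint of variable values, no signature.
    tf.train.Saver().save(sess, os.path.join(tempfile.mkdtemp(), "model.ckpt"))

    # SavedModel: graph def + variables + a serving signature.
    builder = tf.saved_model.builder.SavedModelBuilder(export_dir)
    signature = tf.saved_model.signature_def_utils.predict_signature_def(
        inputs={"x": x}, outputs={"y": y})
    builder.add_meta_graph_and_variables(
        sess,
        tags=[tf.saved_model.tag_constants.SERVING],
        signature_def_map={
            tf.saved_model.signature_constants
              .DEFAULT_SERVING_SIGNATURE_DEF_KEY: signature,
        })
    builder.save()  # writes saved_model.pb and a variables/ directory
```

The `export_dir` produced this way is what you point `gcloud ml-engine models versions create` at.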

I suggest looking at a couple of examples:

  1. The census sample - https://github.com/GoogleCloudPlatform/cloudml-samples/blob/master/census/tensorflowcore/trainer/task.py#L334

  2. And my own sample/library code - https://github.com/TensorLab/tensorfx/blob/master/src/training/_hooks.py#L208 that calls into https://github.com/TensorLab/tensorfx/blob/master/src/prediction/_model.py#L66 to demonstrate how to use a checkpoint, load it into a session and then produce a savedmodel.

Hope these pointers help you adapt your existing code to produce a SavedModel.
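Since you already have checkpoints, the adaptation can be quite small: restore the checkpoint into a session, then export. A hedged sketch of that conversion step (checkpoint path, graph, and tensor names are all hypothetical, again using `tf.compat.v1` for portability):

```python
import os
import tempfile
import tensorflow.compat.v1 as tf

tf.disable_eager_execution()

ckpt_path = os.path.join(tempfile.mkdtemp(), "model.ckpt")

# --- stand-in for your original training script: build, train, checkpoint ---
x = tf.placeholder(tf.float32, shape=[None, 1], name="x")
y = tf.identity(tf.matmul(x, tf.Variable([[3.0]], name="w")), name="y")
with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    tf.train.Saver().save(sess, ckpt_path)

# --- the conversion step: load the checkpoint, then write a SavedModel ---
export_dir = os.path.join(tempfile.mkdtemp(), "1")
with tf.Session() as sess:
    tf.train.Saver().restore(sess, ckpt_path)  # rehydrate the variables
    # simple_save builds a default serving signature from inputs/outputs.
    tf.saved_model.simple_save(
        sess, export_dir, inputs={"x": x}, outputs={"y": y})
```

This mirrors what the linked `_model.py` code does: the checkpoint supplies the weights, and the export call adds the graph and signature that ML Engine needs.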

I think you also asked another similar question about converting a previously exported model; I'll link it here for completeness for anyone else: Deploy retrained inception SavedModel to google cloud ml engine

Upvotes: 4
