Starchand

Reputation: 714

Upload SavedModel to ML engine

I'm trying to upload my SavedModel to ML Engine so I can consume my model online; however, I am getting the error below:

[screenshot of the error message: "please use runtime 1.2 or above"]

I am using TensorFlow 1.5 locally to train my model, following the TensorFlow for Poets tutorial (https://codelabs.developers.google.com/codelabs/tensorflow-for-poets/).

I am then converting my model using the following save_model.py script:

import tensorflow as tf
from tensorflow.python.saved_model import signature_constants
from tensorflow.python.saved_model import tag_constants
from tensorflow.python.saved_model import builder as saved_model_builder

input_graph = 'retrained_graph.pb'
saved_model_dir = 'my_model'

with tf.Graph().as_default() as graph:
  # Read in the export graph
  with tf.gfile.FastGFile(input_graph, 'rb') as f:
      graph_def = tf.GraphDef()
      graph_def.ParseFromString(f.read())
      tf.import_graph_def(graph_def, name='')

  # Define SavedModel Signature (inputs and outputs)
  in_image = graph.get_tensor_by_name('DecodeJpeg/contents:0')
  inputs = {'image_bytes': tf.saved_model.utils.build_tensor_info(in_image)}

  out_classes = graph.get_tensor_by_name('final_result:0')
  outputs = {'prediction': tf.saved_model.utils.build_tensor_info(out_classes)}

  signature = tf.saved_model.signature_def_utils.build_signature_def(
      inputs=inputs,
      outputs=outputs,
      method_name='tensorflow/serving/predict'
  )

  with tf.Session(graph=graph) as sess:
    # Save out the SavedModel.
    b = saved_model_builder.SavedModelBuilder(saved_model_dir)
    b.add_meta_graph_and_variables(
        sess,
        [tf.saved_model.tag_constants.SERVING],
        signature_def_map={'serving_default': signature})
    b.save()
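
For reference, I can sanity-check the export locally before uploading by loading the SavedModel back and printing its signature (a minimal check, assuming the script above ran without errors):

import tensorflow as tf
from tensorflow.python.saved_model import tag_constants

with tf.Session(graph=tf.Graph()) as sess:
    # Load the exported model and inspect the signature ML Engine will use.
    meta_graph = tf.saved_model.loader.load(sess, [tag_constants.SERVING], 'my_model')
    print(meta_graph.signature_def['serving_default'])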

Is the error message saying "please use runtime 1.2 or above" referring to the TensorFlow version? Or is my save_model.py doing something incorrectly?

Upvotes: 2

Views: 507

Answers (1)

rhaertel80

Reputation: 8379

You will need to use gcloud to deploy your model. The console does not let you manually specify the runtime version (i.e. it assumes TensorFlow 1.0). Further note that 1.5 is not yet available but will be very soon. That said, your model might work with 1.4, so it's worth a try.

The command to run is:

gcloud ml-engine versions create v1 --model mymodel --origin=gs://mybucket --runtime-version 1.4

(where v1 is whatever name you want to give this version of the model)

And in the near future you can use --runtime-version 1.5.
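
Once the version is created, you can call online prediction from Python. This is only a rough sketch; the project, model, and version names (and the image file) are placeholders you will need to adapt, and it assumes your input alias is image_bytes as in your export script:

import base64
from googleapiclient import discovery

# Placeholders -- replace with your own project, model, and version names.
name = 'projects/my-project/models/mymodel/versions/v1'

service = discovery.build('ml', 'v1')

with open('image.jpg', 'rb') as f:
    image_bytes = f.read()

# Binary inputs whose alias ends in '_bytes' are sent base64-encoded
# under a 'b64' key.
body = {'instances': [{'image_bytes': {'b64': base64.b64encode(image_bytes).decode('utf-8')}}]}

response = service.projects().predict(name=name, body=body).execute()
print(response)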

For more info, see the reference docs, in particular the gcloud examples.

Upvotes: 3
