amityadav

Reputation: 194

How to write serving input function for Tensorflow model trained without using Estimators?

I have a model trained on a single machine without using Estimator, and I'm looking to serve the final trained model on Google Cloud AI Platform (ML Engine). I exported the frozen graph as a SavedModel using SavedModelBuilder and deployed it on the AI Platform. It works fine for small input images, but for online prediction with large input images it needs to accept base64-encoded strings ({'image_bytes': {'b64': base64.b64encode(jpeg_data).decode()}}), which would be converted to the required tensor by a serving_input_fn if I were using Estimators.
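
For reference, this is roughly what I mean by the Estimator serving_input_fn (a minimal TF 1.x sketch; the tensor names, image size and preprocessing are placeholders, not my actual model):

import tensorflow as tf

def serving_input_fn():
    # A batch of base64-decoded JPEG bytes arrives here as a string tensor
    image_bytes = tf.placeholder(tf.string, shape=[None], name='image_bytes')
    # Decode and resize each image into the tensor the model expects
    images = tf.map_fn(
        lambda b: tf.image.resize_images(
            tf.image.decode_jpeg(b, channels=3), [224, 224]),
        image_bytes, dtype=tf.float32)
    return tf.estimator.export.ServingInputReceiver(
        features={'images': images},
        receiver_tensors={'image_bytes': image_bytes})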

What options do I have if I'm not using an Estimator? If I have a frozen graph or a SavedModel created with SavedModelBuilder, is there a way to get something similar to an Estimator's serving_input_fn when exporting/saving?

Here's the code I'm using for exporting:

import tensorflow as tf
from tensorflow.python.saved_model import signature_constants
from tensorflow.python.saved_model import tag_constants

export_dir = 'serving_model/'
graph_pb = 'model.pb'

builder = tf.saved_model.builder.SavedModelBuilder(export_dir)

# Load the frozen GraphDef from disk
with tf.gfile.GFile(graph_pb, "rb") as f:
    graph_def = tf.GraphDef()
    graph_def.ParseFromString(f.read())

sigs = {}

with tf.Session(graph=tf.Graph()) as sess:
    # name="" is important to ensure we don't get spurious prefixing
    tf.import_graph_def(graph_def, name="")
    g = tf.get_default_graph()

    # Look up the input and output tensors of the imported graph by name
    inp = g.get_tensor_by_name("image_bytes:0")
    out_f1 = g.get_tensor_by_name("feature_1:0")
    out_f2 = g.get_tensor_by_name("feature_2:0")

    sigs[signature_constants.DEFAULT_SERVING_SIGNATURE_DEF_KEY] = \
        tf.saved_model.signature_def_utils.predict_signature_def(
            {"image_bytes": inp}, {"f1": out_f1, "f2": out_f2})

    builder.add_meta_graph_and_variables(sess,
                                         [tag_constants.SERVING],
                                         strip_default_attrs=True,
                                         signature_def_map=sigs)

builder.save()

Upvotes: 3

Views: 930

Answers (2)

AMT

Reputation: 1026

First, load your already exported SavedModel with

import tensorflow as tf
loaded_model = tf.saved_model.load(MODEL_DIR)

Then, wrap it with a new Keras model that takes base64 input

class Base64WrapperModel(tf.keras.Model):
  def __init__(self, model):
    super(Base64WrapperModel, self).__init__()
    self.inner_model = model

  @tf.function
  def call(self, base64_input):
    # tf.io.decode_base64 expects web-safe base64 ('-' and '_' instead of '+' and '/')
    str_input = tf.io.decode_base64(base64_input)
    return self.inner_model(str_input)

wrapper_model = Base64WrapperModel(loaded_model)

Finally, save your wrapped model with the Keras API

wrapper_model.save(EXPORT_DIR)
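
To sanity-check the wrapper before deploying, you can call it directly on a base64 string. A minimal sketch (the file name is a placeholder; note that tf.io.decode_base64 expects the web-safe alphabet, so encode with base64.urlsafe_b64encode on the client side):

import base64

with open('test.jpg', 'rb') as f:
    payload = base64.urlsafe_b64encode(f.read()).decode()

print(wrapper_model(tf.constant([payload])))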

Upvotes: 0

Lak

Reputation: 4166

Use a @tf.function to specify a serving signature. Here's an example that wraps a Keras model:

class ExportModel(tf.keras.Model):
    def __init__(self, model):
        super().__init__()
        self.model = model

    @tf.function(input_signature=[
        tf.TensorSpec([None,], dtype='int32', name='a'),
        tf.TensorSpec([None,], dtype='int32', name='b')
    ])
    def serving_fn(self, a, b):
        return {
            'pred': self.model({'a': a, 'b': b})
        }

    def save(self, export_path):
        sigs = {
            'serving_default' : self.serving_fn
        }
        tf.keras.backend.set_learning_phase(0) # inference only
        tf.saved_model.save(self, export_path, signatures=sigs)

sm = ExportModel(model)
sm.save(EXPORT_PATH)
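
To verify the exported signature, you can load the SavedModel back and invoke serving_default with the named inputs (continuing from the code above, with the same EXPORT_PATH):

reloaded = tf.saved_model.load(EXPORT_PATH)
infer = reloaded.signatures['serving_default']
print(infer(a=tf.constant([1, 2, 3], dtype=tf.int32),
            b=tf.constant([4, 5, 6], dtype=tf.int32)))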

Upvotes: 3
