user10129585

Reputation: 51

How to save a TensorFlow model with tf.estimator

I have the following example code that trains and evaluates a CNN MNIST model using TensorFlow's Estimator API:

    def model_fn(features, labels, mode):
        images = tf.reshape(features, [-1, 28, 28, 1])
        model = Model()
        logits = model(images)

        predicted_logit = tf.argmax(input=logits, axis=1, output_type=tf.int32)

        if mode == tf.estimator.ModeKeys.PREDICT:
            probabilities = tf.nn.softmax(logits)

            predictions = {
                'predicted_logit': predicted_logit,
                'probabilities': probabilities
            }
            return tf.estimator.EstimatorSpec(mode=mode, predictions=predictions)

        else:
            ...

    def mnist_train_and_eval(_):
        train_data, train_labels, eval_data, eval_labels, val_data, val_labels = get_mnist_data()

        # Create an input function for training
        train_input_fn = tf.estimator.inputs.numpy_input_fn(
            x=train_data,
            y=train_labels,
            batch_size=_BATCH_SIZE,
            num_epochs=1,
            shuffle=True)

        # Create an input function for evaluation
        eval_input_fn = tf.estimator.inputs.numpy_input_fn(
            x=eval_data,
            y=eval_labels,
            batch_size=_BATCH_SIZE,
            num_epochs=1,
            shuffle=False)

        # Create an estimator with model_fn
        image_classifier = tf.estimator.Estimator(model_fn=model_fn, model_dir=_MODEL_DIR)

        # Finally, train and evaluate the model after each epoch
        for _ in range(_NUM_EPOCHS):
            image_classifier.train(input_fn=train_input_fn)
            metrics = image_classifier.evaluate(input_fn=eval_input_fn)

How can I use estimator.export_savedmodel to save the trained model for later inference, and how should I write the serving_input_receiver_fn?

Thank you very much for your help!

Upvotes: 2

Views: 583

Answers (1)

Sharky

Reputation: 4543

You create a function that returns a dictionary of input features wrapped in a ServingInputReceiver. The placeholder should match the shape of your images, with the first dimension left as None for the batch size.

def serving_input_receiver_fn():
  # The placeholder shape must match the model's input; for this MNIST
  # example the images are 28x28, and None leaves the batch size flexible.
  x = tf.placeholder(tf.float32, [None, 28, 28], name='x')
  inputs = {'x': x}
  return tf.estimator.export.ServingInputReceiver(features=inputs, receiver_tensors=inputs)

Alternatively, you can use TensorServingInputReceiver, which doesn't require a dict mapping and feeds a single tensor straight to the model:

inputs = tf.placeholder(tf.float32, [None, 28, 28])
tf.estimator.export.TensorServingInputReceiver(inputs, inputs)

serving_input_receiver_fn returns a new instance of ServingInputReceiver each time it is called; the function itself is what you pass to export_savedmodel or to a tf.estimator.FinalExporter.

...
image_classifier.export_savedmodel(saved_dir, serving_input_receiver_fn)

Upvotes: 1
