jamborta

Reputation: 5210

Create a tf.contrib.learn Estimator serving function that takes JSON input

I am after some code that I can use to export a model from a TensorFlow Estimator so that it takes JSON as input. I could make this work with tf.estimator using tf.estimator.export.ServingInputReceiver, but for models built with tf.contrib.learn I could not find any documentation. There is one example here that creates an export that serves tf.Example protos, but Example is a bit tricky to construct.

Upvotes: 1

Views: 395

Answers (3)

Kishore Karunakaran

Reputation: 598

See here for a set of examples that shows how to use a TensorFlow Estimator to serve models on Cloud ML.

Code:

import tensorflow as tf

def serving_fn():
    # commons.FEATURE_COL names the single string input column
    # (defined elsewhere in the linked example).
    receiver_tensor = {
        commons.FEATURE_COL: tf.placeholder(dtype=tf.string, shape=None)
    }

    # The features passed to the model are the same tensors here; any
    # preprocessing of the raw strings would go in this dict instead.
    features = {
        key: tensor
        for key, tensor in receiver_tensor.items()
    }

    return tf.estimator.export.ServingInputReceiver(features, receiver_tensor)
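For reference, a prediction request against an export like this carries one raw string per instance; Cloud ML Engine's online prediction JSON format wraps them in an `instances` list. A minimal sketch (assuming the single string column above):

```python
import json

# Cloud ML Engine online prediction wraps inputs in an "instances" list;
# with a single string placeholder, each instance is just a raw string.
request_body = json.dumps({"instances": ["first input", "second input"]})

# The service decodes this back into the batch fed to the placeholder.
decoded = json.loads(request_body)
print(decoded["instances"])  # ['first input', 'second input']
```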

Upvotes: 0

Lak

Reputation: 4166

To use the contrib Estimator, you have to look at earlier versions of the samples. Here is an example:

https://github.com/GoogleCloudPlatform/training-data-analyst/blob/85c57e4da2e7edeffbb6652636e3c65b313c568f/blogs/babyweight/babyweight/trainer/model.py

Note that you return input function ops rather than a ServingInputReceiver. Having said that, I would recommend migrating to tf.estimator if you can.
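To make that difference concrete, here is a sketch of the contrib-era pattern, following the shape of the linked babyweight example (TF 1.x only; `tf.contrib` was removed in TF 2.x, and the feature name `x` is just an illustration):

```python
import tensorflow as tf
from tensorflow.contrib.learn.python.learn.utils import input_fn_utils

def serving_input_fn():
    # Raw JSON inputs arrive through these placeholders.
    feature_placeholders = {
        'x': tf.placeholder(tf.float32, [None]),
    }
    # Give each per-example tensor the rank the model expects.
    features = {
        key: tf.expand_dims(tensor, -1)
        for key, tensor in feature_placeholders.items()
    }
    # contrib returns an InputFnOps tuple (features, labels, default_inputs)
    # instead of a ServingInputReceiver; labels are None at serving time.
    return input_fn_utils.InputFnOps(features, None, feature_placeholders)
```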

Upvotes: 1

rhaertel80

Reputation: 8389

There are a few examples in CloudML Engine's sample repository, e.g. this code.

To wit, you create placeholders and pass them to the ServingInputReceiver constructor. The outermost dimension should be None to handle variable-sized batches.

def build_receiver():
  # Note: tf.placeholder takes `shape`, not `size`; the leading None
  # dimension is the batch dimension.
  x = tf.placeholder(tf.float32, shape=[None])
  y = tf.placeholder(tf.int32, shape=[None, 128, 128, 3])
  features = {'x': x, 'y': y}
  # Here the receiver tensors and the model features are the same dict.
  return tf.estimator.export.ServingInputReceiver(features, features)
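With multiple named inputs like this, each JSON instance is an object whose keys match the feature names. A sketch of a matching request body (shapes assumed from the placeholders above; the leading batch dimension comes from the instances list itself):

```python
import json

# Each instance supplies one value per feature name; 'y' must match the
# per-example shape [128, 128, 3] declared by its placeholder.
instance = {
    "x": 0.5,
    "y": [[[0, 0, 0]] * 128] * 128,
}
request_body = json.dumps({"instances": [instance]})

parsed = json.loads(request_body)
first = parsed["instances"][0]
print(len(first["y"]), len(first["y"][0]), len(first["y"][0][0]))  # 128 128 3
```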

Upvotes: 0
