Thomas

Reputation: 715

Serving a trained object detection model with TensorFlow Serving

I'm having a hard time serving a TensorFlow model that I've trained from a pretrained model with TensorFlow's Object Detection API.

I've trained a model (ResNet-101) with the model_main.py script and its performance seems ready for production use. So I've created a Docker container that runs tensorflow-serving. I guess this feature is quite new, but the model_main.py script seems to create a servable at the end of training: I found a new folder called "export" in my "train_dir" which contains a saved_model.pb and the variables files variables.data-00000-of-00001 and variables.index. I've managed to serve this model, and the output from tensorflow_model_server looks like this:

2018-08-29 07:47:50.268810: I tensorflow_serving/core/loader_harness.cc:86] Successfully loaded servable version {name: my_model version: 123}
2018-08-29 07:47:50.271480: I tensorflow_serving/model_servers/main.cc:327] Running ModelServer at 0.0.0.0:8500 ...

So serving seems to work.

The problem is that I'm struggling to connect to the server with a Python client. I've modified the client file that comes with the TensorFlow Serving Inception example so that it looks like this:

from __future__ import print_function

# This is a placeholder for a Google-internal import.

import grpc
import tensorflow as tf

from tensorflow_serving.apis import predict_pb2
from tensorflow_serving.apis import prediction_service_pb2_grpc


tf.app.flags.DEFINE_string('server', 'localhost:9000',
                       'PredictionService host:port')
tf.app.flags.DEFINE_string('image', '', 'path to image in JPEG format')
FLAGS = tf.app.flags.FLAGS

def main(_):
  channel = grpc.insecure_channel(FLAGS.server)
  stub = prediction_service_pb2_grpc.PredictionServiceStub(channel)
  # Send request
  with open(FLAGS.image, 'rb') as f:
    # See prediction_service.proto for gRPC request/response details.
    data = f.read()
    request = predict_pb2.PredictRequest()
    request.model_spec.name = 'my_model'
    request.model_spec.signature_name = 'serving_default'
    request.inputs['serialized_example'].CopyFrom(
        tf.contrib.util.make_tensor_proto(data, shape=[1]))
    result = stub.Predict(request, 10.0)  # 10 secs timeout
    print(result)


if __name__ == '__main__':
  tf.app.run()

If I run this script with the ports set properly, I get this error message from inside the model server:

2018-08-29 08:32:48.426016: W external/org_tensorflow/tensorflow/core/framework/op_kernel.cc:1275] OP_REQUIRES failed at example_parsing_ops.cc:240 : Invalid argument: Could not parse example input, value: '����

The client returns some random binary strings. But there is clearly a connection and the request reaches the server.

It seems to me that something is wrong with the request built by the client, but I have no idea how to set it up properly. I couldn't find any information on the default signature that the model_main.py script uses when it exports a trained model, and trying to create a new servable from training checkpoints with a modified exporter.py script failed.
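For reference, this is how I've been inspecting the exported servable to see what the signature actually expects (a minimal sketch assuming TF 1.x SavedModel loader APIs; the export path and version number are placeholders for my actual train_dir layout):

from __future__ import print_function

import tensorflow as tf

# Placeholder path to the servable created by model_main.py.
export_dir = 'train_dir/export/Servo/1535000000'

with tf.Session(graph=tf.Graph()) as sess:
    # Load the SavedModel with the "serve" tag and read its MetaGraphDef.
    meta_graph_def = tf.saved_model.loader.load(
        sess, [tf.saved_model.tag_constants.SERVING], export_dir)
    # Print every signature with its input/output keys, dtypes and shapes.
    for name, signature in meta_graph_def.signature_def.items():
        print('signature:', name)
        for key, info in signature.inputs.items():
            print('  input :', key, tf.DType(info.dtype).name, info.tensor_shape)
        for key, info in signature.outputs.items():
            print('  output:', key, tf.DType(info.dtype).name, info.tensor_shape)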

Does someone know how to set up the client's request properly in this case?

Upvotes: 2

Views: 1091

Answers (2)

Aaron Friedland

Reputation: 48

I ran into the exact same issue while working on my codebase. The solution I found was that the model was exported with the wrong input type. In the exporter.py script, the different options for input are ['image_tensor', 'encoded_image_string_tensor', 'tf_example']. When I exported my model, I had set INPUT_TYPE=image_tensor. After exporting the same model using INPUT_TYPE=encoded_image_string_tensor, the client and server communicated just fine.
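Once the model is re-exported with INPUT_TYPE=encoded_image_string_tensor, the client request from the question would look roughly like this. This is a sketch based on my setup: the input key 'inputs' and the detection_* output keys are what the Object Detection API exporter produced for me, and the server address, model name and image path are placeholders.

import grpc
import tensorflow as tf

from tensorflow_serving.apis import predict_pb2
from tensorflow_serving.apis import prediction_service_pb2_grpc

# gRPC port from the server log in the question.
channel = grpc.insecure_channel('localhost:8500')
stub = prediction_service_pb2_grpc.PredictionServiceStub(channel)

# Read the raw JPEG bytes; no decoding on the client side.
with open('test.jpg', 'rb') as f:
    encoded_image = f.read()

request = predict_pb2.PredictRequest()
request.model_spec.name = 'my_model'
request.model_spec.signature_name = 'serving_default'
# With INPUT_TYPE=encoded_image_string_tensor the exporter names the input
# 'inputs' and expects a batch of encoded image strings, hence shape=[1].
request.inputs['inputs'].CopyFrom(
    tf.contrib.util.make_tensor_proto(encoded_image, shape=[1]))

result = stub.Predict(request, 10.0)  # 10 secs timeout
# The detection signature typically returns detection_boxes, detection_scores,
# detection_classes and num_detections.
print(result.outputs['detection_scores'])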

Upvotes: 1

Kirill Yashuk

Reputation: 83

It seems like you are calling the gRPC port, not the REST API one. https://www.tensorflow.org/tfx/serving/docker

Port 8500 exposed for gRPC

Port 8501 exposed for the REST API
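If you want to go through the REST API on port 8501 instead of gRPC, the request would look roughly like this. This is a sketch assuming the requests package is installed and that the model takes an encoded image string input, which the REST API expects wrapped as {"b64": ...}; the model name and image path are placeholders.

import base64
import json

import requests

# Read and base64-encode the image, as required for binary values in the REST API.
with open('test.jpg', 'rb') as f:
    encoded_image = base64.b64encode(f.read()).decode('utf-8')

# REST endpoint on port 8501; the model name must match the served model.
url = 'http://localhost:8501/v1/models/my_model:predict'
payload = {'instances': [{'b64': encoded_image}]}

response = requests.post(url, data=json.dumps(payload))
print(response.json())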

Upvotes: 1
