Joe Hidakatsu

Reputation: 464

How do I serve a TensorFlow model with the tensorflow/serving Docker image when it uses custom ops?

I'm trying to use the tf-sentencepiece operation (https://github.com/google/sentencepiece/tree/master/tensorflow) in my model.

There is no issue building the model and getting a saved_model.pb file with variables and assets. However, when I try to serve it with the tensorflow/serving Docker image, it fails with:

Loading servable: {name: model version: 1} failed: 
Not found: Op type not registered 'SentencepieceEncodeSparse' in binary running on 0ccbcd3998d1. 
Make sure the Op and Kernel are registered in the binary running in this process. 
Note that if you are loading a saved graph which used ops from tf.contrib, accessing 
(e.g.) `tf.contrib.resampler` should be done before importing the graph, 
as contrib ops are lazily registered when the module is first accessed.

I am unfamiliar with how to build anything manually, and was hoping that I could do this without many changes.

Upvotes: 1

Views: 1507

Answers (1)

Gautam Vasudevan

Reputation: 406

One approach would be to:

  1. Pull a docker development image

    $ docker pull tensorflow/serving:latest-devel

  2. In the container, make your code changes

    $ docker run -it tensorflow/serving:latest-devel

Modify the code to add the op dependency, i.e. link the custom op's kernels into the model server's Bazel BUILD target.
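As a rough sketch of that edit (the sentencepiece target label below is a placeholder — point it at whatever Bazel target actually builds the `SentencepieceEncodeSparse` kernels in your checkout), the custom op is appended to the op list in `tensorflow_serving/model_servers/BUILD`:

```
# tensorflow_serving/model_servers/BUILD (sketch; the sentencepiece label is a placeholder)
SUPPORTED_TENSORFLOW_OPS = [
    "@org_tensorflow//tensorflow/contrib:contrib_kernels",
    "@org_tensorflow//tensorflow/contrib:contrib_ops_op_lib",
    "//external/sentencepiece:sentencepiece_ops",  # placeholder: your custom op target
]
```

The model server links every target in this list into its binary, which is what makes the op "registered in the binary running in this process".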

  3. In the container, build TensorFlow Serving and install the resulting binary

    container:$ bazel build -c opt tensorflow_serving/model_servers:tensorflow_model_server && cp bazel-bin/tensorflow_serving/model_servers/tensorflow_model_server /usr/local/bin/

  4. Use the exit command to exit the container

  5. Look up the ID of the (now stopped) container:

    $ docker ps -a

  6. Use that container ID to commit the development image:

    $ docker commit <container-id> $USER/tf-serving-devel-custom-op

  7. Now build a serving container using the development container as the source

    $ mkdir /tmp/tfserving

    $ cd /tmp/tfserving

    $ git clone https://github.com/tensorflow/serving .

    $ docker build -t $USER/tensorflow-serving --build-arg TF_SERVING_BUILD_IMAGE=$USER/tf-serving-devel-custom-op -f tensorflow_serving/tools/docker/Dockerfile .

  8. You can now use $USER/tensorflow-serving to serve your model, following the standard Docker serving instructions
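For example (the host path and model name below are placeholders — substitute your own SavedModel directory), the standard serving invocation against the custom image looks like:

```shell
# Serve the SavedModel under /path/to/my_model using the rebuilt image.
# Port 8501 is the default REST endpoint; MODEL_NAME must match the mount target.
docker run -p 8501:8501 \
  --mount type=bind,source=/path/to/my_model,target=/models/model \
  -e MODEL_NAME=model -t $USER/tensorflow-serving
```
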

Upvotes: 1
