Reputation: 1089
I am working through the Tensorflow serving_basic example at:
https://tensorflow.github.io/serving/serving_basic
Following: https://tensorflow.github.io/serving/setup#prerequisites
Within a Docker container based on ubuntu:latest, I have installed the following:
echo "deb [arch=amd64] http://storage.googleapis.com/bazel-apt stable jdk1.8" | sudo tee /etc/apt/sources.list.d/bazel.list
curl https://bazel.build/bazel-release.pub.gpg | sudo apt-key add -
sudo apt-get update && sudo apt-get install bazel
sudo apt-get upgrade bazel
pip install grpcio
sudo apt-get update && sudo apt-get install -y build-essential curl libcurl3-dev git libfreetype6-dev libpng12-dev libzmq3-dev pkg-config python-dev python-numpy python-pip software-properties-common swig zip zlib1g-dev
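As a sanity check after installing (standard commands; grpc is the module name the grpcio package provides):
# confirm Bazel is on PATH and prints its version
bazel version
# confirm the grpcio Python package is importable
python -c "import grpc"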
git clone --recurse-submodules https://github.com/tensorflow/serving
cd serving
cd tensorflow
./configure
cd ..
I've built the source with Bazel, and all tests ran successfully:
bazel build tensorflow_serving/...
bazel test tensorflow_serving/...
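If the full build is slow, only three binaries are actually exercised below; building just those targets (labels inferred from the bazel-bin paths used later) is much faster:
bazel build //tensorflow_serving/example:mnist_export
bazel build //tensorflow_serving/example:mnist_client
bazel build //tensorflow_serving/model_servers:tensorflow_model_server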
I can successfully export the mnist model with:
bazel-bin/tensorflow_serving/example/mnist_export /tmp/mnist_model
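A recursive listing confirms the export landed on disk (the name of the numeric version subdirectory depends on the export version used):
ls -R /tmp/mnist_model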
And I can serve the exported model with:
bazel-bin/tensorflow_serving/model_servers/tensorflow_model_server --port=9000 --model_name=mnist --model_base_path=/tmp/mnist_model/
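Before pointing a client at it, it's worth confirming the server is actually listening (netstat comes from the net-tools package, which may need installing in a bare container):
# expect a LISTEN entry on port 9000
netstat -tln | grep 9000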
When I test the server by connecting a client to it with:
bazel-bin/tensorflow_serving/example/mnist_client --num_tests=1000 --server=localhost:9000
I see this output:
root@dc3ea7993fa9:~/serving# bazel-bin/tensorflow_serving/example/mnist_client --num_tests=2 --server=localhost:9000
Extracting /tmp/train-images-idx3-ubyte.gz
Extracting /tmp/train-labels-idx1-ubyte.gz
Extracting /tmp/t10k-images-idx3-ubyte.gz
Extracting /tmp/t10k-labels-idx1-ubyte.gz
AbortionError(code=StatusCode.NOT_FOUND, details="FeedInputs: unable to find feed output images")
AbortionError(code=StatusCode.NOT_FOUND, details="FeedInputs: unable to find feed output images")
Inference error rate is: 100.0%
Upvotes: 1
Views: 1317
Reputation: 1089
I raised this on the TensorFlow GitHub, and the solution was to delete the model that had originally been exported. If you're running into this, run
rm -rf /tmp/mnist_model
and then re-export the model.
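For completeness, the full sequence implied here, using the same paths and flags as in the question:
rm -rf /tmp/mnist_model
bazel-bin/tensorflow_serving/example/mnist_export /tmp/mnist_model
bazel-bin/tensorflow_serving/model_servers/tensorflow_model_server --port=9000 --model_name=mnist --model_base_path=/tmp/mnist_model/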
Upvotes: 0
Reputation: 43
The "--use_saved_model" model flag is set to default "true"; use the --use_saved_model=false when starting the server. This should work:
bazel-bin/tensorflow_serving/model_servers/tensorflow_model_server --use_saved_model=false --port=9000 --model_name=mnist --model_base_path=/tmp/mnist_model/
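You can check which format your export actually is before choosing the flag (the version-subdirectory wildcard is an assumption; adjust to whatever ls /tmp/mnist_model shows):
ls /tmp/mnist_model/*/
# SessionBundle exports contain export.meta plus export-?????-of-????? files;
# SavedModel exports contain saved_model.pb plus a variables/ directory.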
Upvotes: 1