Reputation: 23
How can I serve a model with TensorFlow Serving if it uses tf.contrib operations? I use TensorFlow Serving via Docker (latest, TF 1.11), and when I serve the model I get the following message:
"Failed to start server. Error: Unknown: 1 servable(s) did not become available: {{{name: slider_universal version: 1} due to error: Not found: Op type not registered 'ImageProjectiveTransformV2' in binary running on 254345a5d9f1. Make sure the Op and Kernel are registered in the binary running in this process. Note that if you are loading a saved graph which used ops from tf.contrib, accessing (e.g.) tf.contrib.resampler should be done before importing the graph, as contrib ops are lazily registered when the module is first accessed.}, }"
I also built TensorFlow Serving with Bazel, but got the same error.
I use tf.contrib.image.transform.
If I remove this operation when exporting the model, it can be served by TensorFlow Serving.
Upvotes: 2
Views: 1198
Reputation: 1055
I had problems with the same op, and it seems the only way is to build the server with the op compiled in.
tensorflow_serving/model_servers/BUILD
defines which TensorFlow ops are linked into the server via the variable SUPPORTED_TENSORFLOW_OPS. This confused me at first, since it looks like contrib ops should already be included. However, the TensorFlow contrib build rule doesn't seem to cover the ops under contrib.image, so I explicitly added them by updating the variable to the following:
SUPPORTED_TENSORFLOW_OPS = [
"@org_tensorflow//tensorflow/contrib:contrib_kernels",
"@org_tensorflow//tensorflow/contrib:contrib_ops_op_lib",
"@org_tensorflow//tensorflow/contrib/image:image_ops_kernels",
"@org_tensorflow//tensorflow/contrib/image:image_ops_op_lib",
]
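After editing the BUILD file, the server has to be rebuilt so the extra kernels are linked in. A sketch of the build step, assuming a checkout of the tensorflow/serving repository (target name as in the repo; flags may vary with your setup):

```shell
# From the root of the tensorflow/serving source tree:
bazel build -c opt //tensorflow_serving/model_servers:tensorflow_model_server

# The resulting binary can then serve the model containing
# tf.contrib.image ops:
bazel-bin/tensorflow_serving/model_servers/tensorflow_model_server \
    --model_name=slider_universal --model_base_path=/path/to/export
```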
Upvotes: 2