Vince.Bdn

Reputation: 1175

TensorFlow + cloud-ml : deploy custom native op / reader

I was wondering whether it is possible to deploy TensorFlow custom ops or custom readers written in C++ inside Cloud ML.

It looks like Cloud ML does not accept running native code in its standard mode (I'm not really interested in using a virtualized environment); at least for Python packages, it only accepts pure Python with no C dependencies.
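For context, this is roughly what using such an op looks like locally; the `zero_out.so` library and `zero_out` op are the standard example from the TensorFlow "Adding a New Op" guide and stand in for whatever custom kernel would need to run on Cloud ML:

```python
import tensorflow as tf

# Load the compiled C++ kernel (placeholder name from the TF adding-an-op guide).
zero_out_module = tf.load_op_library('./zero_out.so')

with tf.Session() as sess:
    # The custom op executes as part of normal graph evaluation.
    print(sess.run(zero_out_module.zero_out([[1, 2], [3, 4]])))
```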

Upvotes: 2

Views: 163

Answers (1)

Chris Meyers

Reputation: 1426

Likely the easiest way to do this is to include, as an extra package, a build of the entire custom TensorFlow wheel that includes the op. For specifying extra packages, see https://cloud.google.com/ml-engine/docs/how-tos/packaging-trainer#to_include_custom_dependencies_with_your_package. For building a TensorFlow wheel from source, see https://www.tensorflow.org/install/install_sources#build_the_pip_package.
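A minimal sketch of what that could look like, assuming a trainer package named `trainer` and a locally built wheel; the package name, wheel filename, bucket, and region below are placeholders, and the exact submission flags are covered by the packaging docs linked above:

```python
# setup.py for the trainer package (names and versions are placeholders).
from setuptools import find_packages, setup

setup(
    name='trainer',
    version='0.1',
    packages=find_packages(),
    include_package_data=True,
    description='Trainer package that runs against a custom TensorFlow build.',
)

# The custom TensorFlow wheel built from source is then passed alongside the
# trainer package when submitting the job, e.g. (hypothetical paths):
#
#   gcloud ml-engine jobs submit training my_job \
#       --module-name trainer.task \
#       --packages dist/trainer-0.1.tar.gz,tensorflow-1.2.1-cp27-none-linux_x86_64.whl \
#       --region us-central1 \
#       --staging-bucket gs://my-bucket
```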

You could also try to download and install just the .so file for the new op, but that would require downloading it either inside the setup.py of your training package or inside the training Python code itself.
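A sketch of the second option, downloading the shared library at the start of the training code; the GCS path and op library name are assumptions:

```python
import os
import tensorflow as tf

# Hypothetical GCS location of the compiled op; adjust to your bucket.
GCS_OP_PATH = 'gs://my-bucket/ops/zero_out.so'
LOCAL_OP_PATH = '/tmp/zero_out.so'

# Copy the shared library onto the training worker's local filesystem,
# then load it so the op's Python wrappers become available.
if not os.path.exists(LOCAL_OP_PATH):
    tf.gfile.Copy(GCS_OP_PATH, LOCAL_OP_PATH, overwrite=True)

custom_ops = tf.load_op_library(LOCAL_OP_PATH)
```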

Note that you can currently only upload custom packages during Training, and not during Batch or Online Prediction, so a model trained using a custom TF version may not work with the prediction service.

Upvotes: 1
