user3742631

Reputation: 143

How to serve PyTorch or sklearn models using TensorFlow Serving

I have found tutorials and posts which only say how to serve TensorFlow models using TensorFlow Serving. In the model.conf file, there is a model_platform parameter in which tensorflow or any other platform can be mentioned. But how do we export models from other platforms in the TensorFlow way so that they can be loaded by TensorFlow Serving?
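
For context, model_platform lives in TensorFlow Serving's model config file, which is a text-format protobuf; the name and path below are placeholders:

```
model_config_list {
  config {
    name: "my_model"
    base_path: "/models/my_model"
    model_platform: "tensorflow"
  }
}
```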

Upvotes: 9

Views: 4839

Answers (3)

dasilvadaniel

Reputation: 463

Now you can serve your scikit-learn model with TensorFlow Extended (TFX): https://www.tensorflow.org/tfx/guide/non_tf

Upvotes: 1

hyttysmyrkky

Reputation: 23

Not answering the question, but since no better answers exist yet: in addition to the alternative directions suggested by adrin, these might be helpful:

Upvotes: 1

adrin

Reputation: 4896

I'm not sure you can. The TensorFlow Serving platform is designed to be flexible, but if you really want to use it, you'd probably need to implement a C++ library that loads your saved model (in protobuf) and hands a servable to the TensorFlow Serving platform. Here's a similar question.

I haven't seen such an implementation, and the efforts I've seen usually go in two other directions:

  1. Pure Python code serving the model over HTTP or gRPC, for instance, such as what's being developed in Pipeline.AI (see the first sketch below).
  2. Dump the model in PMML format and serve it with Java code (see the second sketch below).
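
A minimal sketch of the first direction, assuming Flask and a scikit-learn model saved with joblib (both are my assumptions; the answer names no specific stack):

```python
import joblib
from flask import Flask, jsonify, request

app = Flask(__name__)
# "model.joblib" is a hypothetical path to a saved scikit-learn model.
model = joblib.load("model.joblib")

@app.route("/predict", methods=["POST"])
def predict():
    # Expects JSON like {"instances": [[5.1, 3.5, 1.4, 0.2], ...]}
    instances = request.get_json()["instances"]
    return jsonify({"predictions": model.predict(instances).tolist()})

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8501)
```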

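And a sketch of the second direction using the sklearn2pmml package (my choice of exporter; any PMML converter would do). The resulting .pmml file can then be served from Java, e.g. with a JPMML-based scoring service:

```python
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn2pmml import sklearn2pmml
from sklearn2pmml.pipeline import PMMLPipeline

X, y = load_iris(return_X_y=True)

# Wrap the estimator in a PMMLPipeline so it can be converted.
pipeline = PMMLPipeline([("classifier", LogisticRegression(max_iter=200))])
pipeline.fit(X, y)

# Writes a PMML document; the converter needs a local Java runtime.
sklearn2pmml(pipeline, "LogisticRegression.pmml")
```
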
Upvotes: 1
