Reputation: 2045
Say I have two saved models, one from tensorflow 1.8 and the other from tensorflow 2.2. Serving both of them could run into compatibility issues.
Would it be possible to serve both of them in the same tensorflow/serving binary?
My intuition says no, one cannot, at least not easily.
I am not an expert in Bazel files, but I presume compiling tensorflow/serving requires building and linking the TensorFlow core library, and I am not sure whether two different versions of that library could be linked into the same binary.
I guess one could compile the tensorflow/serving binary at two different release points, 1.8.0 and 2.2.0, and deploy both binaries separately in your infrastructure. The model discovery and request routing layers would then have to track which model is loaded in which tensorflow/serving binary, and which predict request should talk to which tensorflow/serving endpoint. A rough sketch of that routing is below.
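Something like this minimal Python sketch is what I have in mind. The model names and ports are made up, and I am assuming both builds expose the standard TF Serving REST predict endpoint:

    import requests

    # Hypothetical endpoints: one serving binary built at 1.8.0, one at 2.2.0.
    MODEL_ENDPOINTS = {
        "legacy_model": "http://localhost:8501",  # tensorflow/serving 1.8.0
        "new_model": "http://localhost:8502",     # tensorflow/serving 2.2.0
    }

    def predict(model_name, instances):
        # Forward the request to whichever binary hosts this model,
        # using the standard TF Serving REST predict endpoint.
        base_url = MODEL_ENDPOINTS[model_name]
        response = requests.post(
            "%s/v1/models/%s:predict" % (base_url, model_name),
            json={"instances": instances},
        )
        response.raise_for_status()
        return response.json()["predictions"]

A call like predict("legacy_model", [[1.0, 2.0]]) would then transparently hit the 1.8.0 binary.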
Upvotes: 0
Views: 318
Reputation: 1343
I'm definitely not an expert on the deep inner workings of TensorFlow, so take this with a grain of salt. But I think what you want to do may actually be pretty easy.
My very approximate (and possibly completely incorrect) understanding is that the TensorFlow APIs are a sort of wrapper that creates a graph representing whatever computation you'd like to do, and that the compiled graph is cross-compatible between at least some versions, even if the APIs used to create and manipulate it aren't.
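One cheap way to test that compatibility claim before involving Serving at all is to try loading the old SavedModel from TF 2.x directly. This is just a sketch, and the path is a placeholder:

    import tensorflow as tf  # assumes a TF 2.x installation

    # Point this at the version directory of a 1.x-exported SavedModel.
    loaded = tf.saved_model.load("/models/legacy_model/1")
    # 1.x SavedModels are exposed through their named signatures in 2.x.
    print(list(loaded.signatures.keys()))  # e.g. ["serving_default"]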
Empirically, I've been able to take models built with TensorFlow 1.15.x and put them into TensorFlow Serving on 2.3.0 with absolutely no problems at all.
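If the same holds for your 1.8 model, you may not need two binaries at all: a single 2.x tensorflow/serving instance can be pointed at both SavedModels through a model config file passed via --model_config_file. The names and paths here are placeholders:

    model_config_list {
      config {
        name: "model_from_tf18"
        base_path: "/models/model_from_tf18"
        model_platform: "tensorflow"
      }
      config {
        name: "model_from_tf22"
        base_path: "/models/model_from_tf22"
        model_platform: "tensorflow"
      }
    }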
Upvotes: 0