Reputation: 2257
I trained a tensorflow model that I'd like to run predictions on from numpy arrays. This is for image processing within videos. I will pass the images to the model as they happen; not every frame is passed.
I reload my SavedModel within a session like so
def run(self):
    with tf.Session(graph=tf.Graph()) as sess:
        tf.saved_model.loader.load(sess,
            [tf.saved_model.tag_constants.SERVING], "model")
My code works perfectly if I pass a list of images (self.tfimages) to the prediction. Condensed to:
softmax_tensor = sess.graph.get_tensor_by_name('final_ops/softmax:0')
predictions = sess.run(softmax_tensor, {'Placeholder:0': self.tfimages})
But I won't have all the images at once. Do I really have to reload the model from file each time? (It takes 2+ minutes.)
I thought to do something like this
class tensorflow_model:
    def __init__(self):
        with tf.Session(graph=tf.Graph()) as self.sess:
            tf.saved_model.loader.load(self.sess,
                [tf.saved_model.tag_constants.SERVING], "model")

    def predict(self):
        # Feed the image_data as input to the graph and get first prediction
        softmax_tensor = self.sess.graph.get_tensor_by_name('final_ops/softmax:0')
        predictions = self.sess.run(softmax_tensor, {'Placeholder:0': self.tfimages})
but that yields
builtins.RuntimeError: Attempted to use a closed Session
Is there a way to keep a session open, or perhaps load SavedModel independent of a session?
EDIT: Following the first answer, I tried creating the session in two steps:
sess = tf.Session(graph=tf.Graph())
sess
<tensorflow.python.client.session.Session object at 0x0000021ACBB62EF0>
tf.saved_model.loader.load(sess, [tf.saved_model.tag_constants.SERVING], "model")
Traceback (most recent call last):
  Debug Probe, prompt 138, line 1
  File "C:\Program Files\Python35\Lib\site-packages\tensorflow\python\saved_model\loader_impl.py", line 222, in load
    saver.restore(sess, variables_path)
  File "C:\Program Files\Python35\Lib\site-packages\tensorflow\python\training\saver.py", line 1428, in restore
    {self.saver_def.filename_tensor_name: save_path})
  File "C:\Program Files\Python35\Lib\site-packages\tensorflow\python\client\session.py", line 774, in run
    run_metadata_ptr)
  File "C:\Program Files\Python35\Lib\site-packages\tensorflow\python\client\session.py", line 905, in _run
    raise RuntimeError('The Session graph is empty. Add operations to the '
builtins.RuntimeError: The Session graph is empty. Add operations to the graph before calling run().
Whereas
with tf.Session(graph=tf.Graph()) as sess:
    tf.saved_model.loader.load(sess, [tf.saved_model.tag_constants.SERVING], "model")
executes without error.
As for the second idea of passing sess as a variable to the class: that's a good one. This works:
with tf.Session(graph=tf.Graph()) as sess:
    tf.saved_model.loader.load(sess, [tf.saved_model.tag_constants.SERVING], "model")
    tensorflow_instance = tensorflow(read_from="file")
    tensorflow_instance.predict(sess)
But this doesn't:
sess = tf.Session(graph=tf.Graph())
tf.saved_model.loader.load(sess, [tf.saved_model.tag_constants.SERVING], "model")
tensorflow_instance = tensorflow(read_from="file")
tensorflow_instance.predict(sess)
It would be pretty awkward to wrap my whole program in a with ... as sess: block.
Full code:
import tensorflow as tf
import sys
from google.protobuf import text_format
from tensorflow.core.framework import graph_pb2
import os
import glob

class tensorflow:
    def __init__(self, read_from):
        # frames to be analyzed
        self.tfimages = []
        find_photos = glob.glob("*.jpg")
        # Read in the image_data
        if read_from == "file":
            for x in find_photos:
                image_data = tf.gfile.FastGFile(x, 'rb').read()
                self.tfimages.append(image_data)
        # Loads label file, strips off carriage return
        self.label_lines = [line.rstrip() for line in tf.gfile.GFile("dict.txt")]

    def predict(self, sess):
        # Feed the image_data as input to the graph and get first prediction
        softmax_tensor = sess.graph.get_tensor_by_name('final_ops/softmax:0')
        predictions = sess.run(softmax_tensor, {'Placeholder:0': self.tfimages})
        for prediction in predictions:
            # Sort to show labels of first prediction in order of confidence
            top_k = prediction.argsort()[-len(prediction):][::-1]
            for node_id in top_k:
                human_string = self.label_lines[node_id]
                score = prediction[node_id]
                print('%s (score = %.5f)' % (human_string, score))
        return human_string

if __name__ == "__main__":
    with tf.Session(graph=tf.Graph()) as sess:
        tf.saved_model.loader.load(sess, [tf.saved_model.tag_constants.SERVING], "model")
        tensorflow_instance = tensorflow(read_from="file")
        tensorflow_instance.predict(sess)

    sess = tf.Session(graph=tf.Graph())
    tf.saved_model.loader.load(sess, [tf.saved_model.tag_constants.SERVING], "model")
    tensorflow_instance = tensorflow(read_from="file")
    tensorflow_instance.predict(sess)
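As an aside, the label-ranking logic inside predict can be checked in isolation with plain NumPy (top_k_labels is a hypothetical helper written for illustration, not part of the code above):

```python
import numpy as np

def top_k_labels(prediction, labels):
    # argsort ascending, take all indices, reverse -> descending by score,
    # same as prediction.argsort()[-len(prediction):][::-1] above
    order = np.argsort(prediction)[-len(prediction):][::-1]
    return [(labels[i], float(prediction[i])) for i in order]

print(top_k_labels(np.array([0.1, 0.7, 0.2]), ["cat", "dog", "bird"]))
# [('dog', 0.7), ('bird', 0.2), ('cat', 0.1)]
```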
Upvotes: 16
Views: 13426
Reputation: 8389
Others have explained why you can't put your session in a with statement in the constructor.
The reason you see different behavior with the context manager versus without is that tf.saved_model.loader.load has some weird interactions between the default graph and the graph attached to the session.
The solution is simple: don't pass a graph to the session if you're not using it in a with block:
sess = tf.Session()
tf.saved_model.loader.load(sess, [tf.saved_model.tag_constants.SERVING], "model")
Here's some example code for a class to do predictions:
class Model(object):
    def __init__(self, model_path):
        # Note, if you don't want to leak this, you'll want to turn Model into
        # a context manager. In practice, you probably don't have to worry
        # about it.
        self.session = tf.Session()
        tf.saved_model.loader.load(
            self.session,
            [tf.saved_model.tag_constants.SERVING],
            model_path)
        self.softmax_tensor = self.session.graph.get_tensor_by_name('final_ops/softmax:0')

    def predict(self, images):
        predictions = self.session.run(self.softmax_tensor, {'Placeholder:0': images})
        # TODO: convert to human-friendly labels
        return predictions

images = [tf.gfile.FastGFile(f, 'rb').read() for f in glob.glob("*.jpg")]
model = Model('model_path')
print(model.predict(images))

# Alternatively (uses less memory, but has lower throughput):
for f in glob.glob("*.jpg"):
    print(model.predict([tf.gfile.FastGFile(f, 'rb').read()]))
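The context-manager variant that the comment in __init__ alludes to looks like this; a stand-in FakeSession replaces tf.Session so the pattern is runnable without TensorFlow (all names here are illustrative, not from the answer above):

```python
class FakeSession:
    """Stand-in for tf.Session: tracks whether it has been closed."""
    def __init__(self):
        self.closed = False

    def close(self):
        self.closed = True

class Model(object):
    def __init__(self, model_path):
        # Real code would call tf.Session() and tf.saved_model.loader.load here
        self.session = FakeSession()
        self.model_path = model_path

    def __enter__(self):
        return self

    def __exit__(self, exc_type, exc_val, exc_tb):
        # Guarantees the session is released when the with block exits
        self.session.close()

with Model("model") as m:
    inner_closed = m.session.closed  # False: session is open inside the block
print(m.session.closed)              # True: closed on exit
```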
Upvotes: 9
Reputation: 222701
Your code does not work because you open the session inside __init__'s with block, so it is closed as soon as __init__ finishes. There is no open session left afterwards.
If you want to make many predictions after your model has been trained, I recommend you not reinvent the wheel but use the tool TF developers created for exactly this purpose: TF Serving.
TensorFlow Serving is a flexible, high-performance serving system for machine learning models, designed for production environments. TensorFlow Serving makes it easy to deploy new algorithms and experiments, while keeping the same server architecture and APIs. TensorFlow Serving provides out-of-the-box integration with TensorFlow models, but can be easily extended to serve other types of models and data.
They have a lot of tutorials, starting from very basic ones, and spending a day learning a few things will save you months later.
Upvotes: 0
Reputation: 642
Your code creates a scope that is exited as soon as __init__ returns, which closes the session:
def __init__(self):
    with tf.Session(graph=tf.Graph()) as self.sess:
        tf.saved_model.loader.load(self.sess,
            [tf.saved_model.tag_constants.SERVING], "model")
The following should work for you, if you have everything else working properly:
def __init__(self):
    self.sess = tf.Session(graph=tf.Graph())
    tf.saved_model.loader.load(self.sess,
        [tf.saved_model.tag_constants.SERVING], "model")
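The scoping problem is plain Python rather than anything TensorFlow-specific; a minimal stand-in resource (no TF required, all names illustrative) reproduces both the "closed Session" failure and the fix:

```python
class Resource:
    """Stand-in for tf.Session: raises once closed, like a closed Session."""
    def __init__(self):
        self.closed = False

    def run(self):
        if self.closed:
            raise RuntimeError("Attempted to use a closed Session")
        return "ok"

    def __enter__(self):
        return self

    def __exit__(self, *exc):
        self.closed = True

class Broken:
    def __init__(self):
        with Resource() as self.res:  # __exit__ fires when __init__ returns
            pass

class Fixed:
    def __init__(self):
        self.res = Resource()         # stays open; close it yourself later

try:
    Broken().res.run()
except RuntimeError as e:
    print(e)                          # mirrors the error from the question

print(Fixed().res.run())              # prints "ok"
```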
When I do something like this, I also usually add the option of passing a session to the class as a parameter; then, when I call the class, I pass in a session created with a with block.
Upvotes: 1