Reputation: 2123
I am trying to convert a TF 2.0 saved_model to TensorRT on the Jetson Nano.
The model was saved in TF 2.0.0. The Nano has JetPack 4.2.2 w/ TensorRT __ and TensorFlow 1.14 (the latest TensorFlow release available for Jetson).
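For reference, here is how I check the versions on the Nano (assuming the tensorrt Python bindings that JetPack installs are importable):

import tensorflow as tf
import tensorrt  # Python bindings installed by JetPack

print(tf.__version__)        # 1.14.x
print(tensorrt.__version__)  # TensorRT version bundled with JetPack 4.2.2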
I have been following the instructions from here, which describe how to convert a TF 2.0.0 saved_model into TensorRT.
Below is my code:
import tensorflow as tf
from tensorflow.python.compiler.tensorrt import trt_convert as trt
from tensorflow.python.framework import convert_to_constants
from tensorflow.python.saved_model import signature_constants, tag_constants

tf.enable_eager_execution()

# Convert the TF 2.0 saved_model using the TF-TRT 2.0 API
converter = trt.TrtGraphConverterV2(input_saved_model_dir=input_saved_model_dir)
converter.convert()
converter.save(output_saved_model_dir)

# Reload the converted model and run inference
saved_model_loaded = tf.saved_model.load(
    output_saved_model_dir, tags=[tag_constants.SERVING])
graph_func = saved_model_loaded.signatures[
    signature_constants.DEFAULT_SERVING_SIGNATURE_DEF_KEY]
frozen_func = convert_to_constants.convert_variables_to_constants_v2(
    graph_func)

def wrap_func(*args, **kwargs):
    # Assumes frozen_func has one output tensor
    return frozen_func(*args, **kwargs)[0]

output = wrap_func(input_data).numpy()
It seems to start converting successfully. However, I get a KeyError: 'serving_default' error when it reaches the converter.convert() line. My complete printout can be found here (it is too long for SO), but the Python traceback appears below. How can I fix this?
Thanks!
Printout summary (complete printout here):
Traceback (most recent call last):
  File "tst.py", line 38, in <module>
    convert_savedmodel()
  File "tst.py", line 24, in convert_savedmodel
    converter.convert()
  File "/usr/local/lib/python3.6/dist-packages/tensorflow/python/compiler/tensorrt/trt_convert.py", line 956, in convert
    func = self._saved_model.signatures[self._input_saved_model_signature_key]
  File "/usr/local/lib/python3.6/dist-packages/tensorflow/python/saved_model/signature_serialization.py", line 196, in __getitem__
    return self._signatures[key]
KeyError: 'serving_default'
Upvotes: 0
Views: 1590
Reputation: 147
I can see two problems in your experiment:
1. You are using the TF-TRT 2.0 API (TrtGraphConverterV2) while having TF 1.14 installed. That is not supported: with TF 1.14 you need to use the TF-TRT 1.x API (TrtGraphConverter).
2. Models saved in TF 2.0 are not compatible with TF 1.14, according to https://www.tensorflow.org/guide/versions
If you only have access to TF 1.14, I suggest re-generating the graph in TF 1.14, saving the model there, and then applying TF-TRT with the TF-TRT 1.x API, as sketched below.
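A minimal sketch of that workflow, assuming your network can be rebuilt with tf.keras under TF 1.14 (build_model, weights.h5, and the directory names are placeholders for your own code and paths; FP16 precision is just an example):

import tensorflow as tf
from tensorflow.python.compiler.tensorrt import trt_convert as trt

# Stay in graph mode; do not enable eager execution for the TF 1.x workflow
tf.keras.backend.set_learning_phase(0)  # export for inference

# Rebuild the graph in TF 1.14 and restore the trained weights
model = build_model()             # placeholder: your model definition
model.load_weights('weights.h5')  # placeholder: weights from training

# Export a TF 1.x saved_model that TF-TRT 1.x can consume
tf.saved_model.simple_save(
    tf.keras.backend.get_session(),
    'saved_model_tf114',
    inputs={'input': model.input},
    outputs={'output': model.output})

# Convert with the TF-TRT 1.x API (TrtGraphConverter, not TrtGraphConverterV2)
converter = trt.TrtGraphConverter(
    input_saved_model_dir='saved_model_tf114',
    precision_mode=trt.TrtPrecisionMode.FP16)  # example precision mode
converter.convert()
converter.save('saved_model_trt')

Whether your TF 2.0 training weights load cleanly into a TF 1.14 graph depends on the model; if they do not, you would need to retrain or re-export the weights under TF 1.14.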
Upvotes: 1