Reputation: 583
I am using TensorFlow v2.3.0. I would like to convert a model I saved as HDF5 (a .h5 file) to protocol buffers (a .pb file). There are existing solutions that I have tried, but they do not work directly because they were written for TensorFlow v1.x. So I edited the code to make it compatible with TensorFlow v2.3.0 and ended up with this:
import tensorflow as tf
from tensorflow.keras.models import load_model
from tensorflow.compat.v1.keras.backend import get_session
from tensorflow.python.platform import gfile
from tensorflow.compat.v1 import global_variables
from tensorflow.compat.v1.graph_util import convert_variables_to_constants as c_to_c
model = load_model('models/model-v2.h5')
# print(model.summary())
Model: "functional_1"
_________________________________________________________________
Layer (type)                 Output Shape              Param #
=================================================================
input_1 (InputLayer)         [(None, 150, 150, 3)]     0
_________________________________________________________________
conv2d (Conv2D)              (None, 148, 148, 16)      448
_________________________________________________________________
max_pooling2d (MaxPooling2D) (None, 74, 74, 16)        0
_________________________________________________________________
conv2d_1 (Conv2D)            (None, 72, 72, 32)        4640
_________________________________________________________________
max_pooling2d_1 (MaxPooling2 (None, 36, 36, 32)        0
_________________________________________________________________
conv2d_2 (Conv2D)            (None, 34, 34, 64)        18496
_________________________________________________________________
max_pooling2d_2 (MaxPooling2 (None, 17, 17, 64)        0
_________________________________________________________________
flatten (Flatten)            (None, 18496)             0
_________________________________________________________________
dense (Dense)                (None, 512)               9470464
_________________________________________________________________
dropout (Dropout)            (None, 512)               0
_________________________________________________________________
dense_1 (Dense)              (None, 1)                 513
=================================================================
Total params: 9,494,561
Trainable params: 9,494,561
Non-trainable params: 0
# print(model.outputs)
# [<tf.Tensor 'dense_1/Sigmoid:0' shape=(None, 1) dtype=float32>]
# print(model.inputs)
# [<tf.Tensor 'input_1:0' shape=(None, 150, 150, 3) dtype=float32>]
def freeze_session(session, keep_var_names=None, output_names=None, clear_devices=True):
    graph = session.graph
    with graph.as_default():
        freeze_var_names = list(set(v.op.name for v in global_variables()).difference(keep_var_names or []))
        output_names = output_names or []
        output_names += [v.op.name for v in global_variables()]
        # Graph -> GraphDef ProtoBuf
        input_graph_def = graph.as_graph_def()
        if clear_devices:
            print(input_graph_def.node)
            for node in input_graph_def.node:
                print('Node', node)
                node.device = ""
        frozen_graph = c_to_c(session, input_graph_def, output_names, freeze_var_names)
        return frozen_graph

frozen_graph = freeze_session(tf.compat.v1.Session(),
                              output_names=[out.op.name for out in model.outputs])
# Save to model/model.pb
tf.io.write_graph(frozen_graph, "models", "model_v2.pb", as_text=False)
I am however encountering this error: AssertionError: dense_1/Sigmoid is not in graph. Apparently the graph_def has an empty node list, because nothing is printed when I loop over the nodes. I would like to know how to correct this code, or whether there is a better alternative for converting a .h5 model to .pb.
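Incidentally, the empty node list is exactly what TensorFlow 2.x produces here: with eager execution enabled, Keras builds its ops inside internal FuncGraphs, so the v1-style default graph that `tf.compat.v1.Session()` wraps never receives any nodes. A minimal sketch demonstrating this (the small `Sequential` model is a hypothetical stand-in for the question's model):

```python
import tensorflow as tf

# A stand-in Keras model, built under TF 2.x eager execution.
model = tf.keras.Sequential([tf.keras.layers.Dense(1)])
model.build((None, 3))

# The session wraps the v1 default graph, which Keras never touched,
# so its GraphDef has no nodes -- hence "dense_1/Sigmoid is not in graph".
sess = tf.compat.v1.Session()
print(len(sess.graph.as_graph_def().node))
```

This is why the v1-era `freeze_session` recipe cannot work unmodified under TF 2.x, regardless of which `tf.compat.v1` shims are imported.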
Upvotes: 1
Views: 1474
Reputation: 583
After a lot of googling, I found the correct (TensorFlow 2.x-compatible) code here: https://leimao.github.io/blog/Save-Load-Inference-From-TF2-Frozen-Graph/
import tensorflow as tf
from tensorflow.keras.models import load_model
from tensorflow.python.framework.convert_to_constants import convert_variables_to_constants_v2

model = load_model('models/model-v2.h5')

# Convert Keras model to ConcreteFunction
full_model = tf.function(lambda x: model(x))
full_model = full_model.get_concrete_function(
    x=tf.TensorSpec(model.inputs[0].shape, model.inputs[0].dtype))

# Get frozen ConcreteFunction
frozen_func = convert_variables_to_constants_v2(full_model)
frozen_func.graph.as_graph_def()
layers = [op.name for op in frozen_func.graph.get_operations()]
print("-" * 50)
print("Frozen model layers: ")
for layer in layers:
    print(layer)
print("-" * 50)
print("Frozen model inputs: ")
print(frozen_func.inputs)
print("Frozen model outputs: ")
print(frozen_func.outputs)
# Save frozen graph from frozen ConcreteFunction to hard drive
tf.io.write_graph(graph_or_graph_def=frozen_func.graph,
                  logdir="./frozen_models",
                  name="simple_frozen_graph.pb",
                  as_text=False)
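For completeness, a frozen .pb written this way can be loaded back for inference by importing the GraphDef and pruning it down to the input/output tensors. The `wrap_frozen_graph` helper below follows the pattern from the same blog post; the tiny functional model is a stand-in for the question's .h5 model:

```python
import numpy as np
import tensorflow as tf
from tensorflow.python.framework.convert_to_constants import convert_variables_to_constants_v2

def wrap_frozen_graph(graph_def, inputs, outputs):
    # Import the GraphDef into a wrapper function, then prune it down
    # to a callable mapping the named inputs to the named outputs.
    def _imports_graph_def():
        tf.compat.v1.import_graph_def(graph_def, name="")
    wrapped_import = tf.compat.v1.wrap_function(_imports_graph_def, [])
    import_graph = wrapped_import.graph
    return wrapped_import.prune(
        tf.nest.map_structure(import_graph.as_graph_element, inputs),
        tf.nest.map_structure(import_graph.as_graph_element, outputs))

# Stand-in model; the question would use load_model('models/model-v2.h5') instead.
inp = tf.keras.Input(shape=(3,), name="x")
out = tf.keras.layers.Dense(1)(inp)
model = tf.keras.Model(inp, out)

# Freeze exactly as above.
full_model = tf.function(lambda x: model(x))
full_model = full_model.get_concrete_function(
    tf.TensorSpec([None, 3], tf.float32))
frozen_func = convert_variables_to_constants_v2(full_model)

# Rebuild a callable from the frozen GraphDef and run inference on it.
infer = wrap_frozen_graph(frozen_func.graph.as_graph_def(),
                          inputs=frozen_func.inputs[0].name,
                          outputs=frozen_func.outputs[0].name)
sample = tf.constant(np.random.rand(2, 3).astype(np.float32))
print(infer(sample).shape)  # same shape as model(sample)
```

In production you would read the GraphDef back from the .pb file with `tf.compat.v1.GraphDef.FromString` before wrapping it; here the in-memory GraphDef is reused to keep the sketch self-contained.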
The code used is from this file: https://github.com/leimao/Frozen_Graph_TensorFlow/blob/master/TensorFlow_v2/example_1.py
Upvotes: 2
Reputation: 861
Try to use tf.saved_model.save in compatibility mode:
tf.keras.Model instances constructed from inputs and outputs already have a signature and so do not require a @tf.function decorator or a signatures argument. If neither are specified, the model's forward pass is exported.
import tensorflow as tf

x = tf.keras.Input((4,), name="x")
y = tf.keras.layers.Dense(5, name="out")(x)
model = tf.keras.Model(x, y)
tf.compat.v1.saved_model.save(model, '/tmp/saved_model/')
# The exported SavedModel takes "x" with shape [None, 4] and returns "out"
# with shape [None, 5]
P.S. I have not had the opportunity to test this; please let me know the result if you try it.
Upvotes: 0