JPabloFuenzalida

Reputation: 81

How to load a trained model saved by export_inference_graph.py?

I'm following an example that uses TensorFlow 1.15.0's object detection API. The tutorial is clear on the following aspects:

What I have not been able to accomplish, however, is loading the saved model to use it. I tried with tf.saved_model.loader.load(sess, tags, export_dir), but I get

INFO:tensorflow:Saver not created because there are no variables in the graph to restore.
INFO:tensorflow:The specified SavedModel has no variables; no checkpoints were restored.

the folder given in export_dir has the following structure:

+dir
   +saved_model
      -saved_model.pb
   -model.ckpt.data-00000-of-00001
   -model.ckpt.index
   -checkpoint
   -frozen_inference_graph.pb
   -model.ckpt.meta
   -pipeline.config

My final goal here is to capture images with a camera and feed them to the net for real-time object detection. As an intermediate step, for now I just want to be able to feed it a single picture and get the output. I was able to train the net, but now I can't use it.

Thank you in advance.

Upvotes: 1

Views: 2471

Answers (1)

JPabloFuenzalida

Reputation: 81

I found an example on how to download a model that let me work through this. Since the folder layout of the file downloaded in that example is the same one my code produces, I just had to adapt it.

The original function that downloads the model is

def load_model(model_name):
  base_url = 'http://download.tensorflow.org/models/object_detection/'
  model_file = model_name + '.tar.gz'
  model_dir = tf.keras.utils.get_file(
    fname=model_name, 
    origin=base_url + model_file,
    untar=True)

  model_dir = pathlib.Path(model_dir)/"saved_model"

  model = tf.saved_model.load(str(model_dir))
  model = model.signatures['serving_default']

  return model

Then I used that function to create this new one

def load_local_model(model_path):
  model_dir = pathlib.Path(model_path)/"saved_model"

  model = tf.saved_model.load(str(model_dir))
  model = model.signatures['serving_default']

  return model
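A usage sketch for it (the export path and the dummy frame below are placeholders of mine; the TensorFlow lines are commented out because they need the import blocks shown further down):

```python
import pathlib

# Mirrors load_local_model's path logic: export_inference_graph.py writes
# the SavedModel under <export_dir>/saved_model.
def saved_model_path(model_path):
    return str(pathlib.Path(model_path) / "saved_model")

# With TensorFlow and the imports below in place, usage looks like:
#   detection_model = load_local_model('dir')  # the export folder from the question
#   image = np.zeros((480, 640, 3), dtype=np.uint8)           # stand-in for a camera frame
#   input_tensor = tf.convert_to_tensor(image)[tf.newaxis, ...]
#   outputs = detection_model(input_tensor)    # dict with detection_boxes, detection_scores, ...
```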

At first this didn't work, since tf.saved_model.load expected 3 arguments, but that was solved by adding the two import blocks from the same example. I still don't know which import did the trick or why (I'll edit this answer when I find out; my guess is that one of them switches tf to the 2.x-style API, where tf.saved_model.load takes a single path), but for the moment this code works, and the example lets you do more things.

The import blocks are the following

import numpy as np
import os
import six.moves.urllib as urllib
import sys
import tarfile
import tensorflow as tf
import zipfile

from collections import defaultdict
from io import StringIO
from matplotlib import pyplot as plt
from PIL import Image
from IPython.display import display

and

from object_detection.utils import ops as utils_ops
from object_detection.utils import label_map_util
from object_detection.utils import visualization_utils as vis_util

EDIT: What was really needed for this to work was the following block.

import os
import pathlib


if "models" in pathlib.Path.cwd().parts:
  while "models" in pathlib.Path.cwd().parts:
    os.chdir('..')
elif not pathlib.Path('models').exists():
  !git clone --depth 1 https://github.com/tensorflow/models

%%bash
cd models/research/
protoc object_detection/protos/*.proto --python_out=.

%%bash 
cd models/research
pip install .

Otherwise this import block won't work

from object_detection.utils import ops as utils_ops
from object_detection.utils import label_map_util
from object_detection.utils import visualization_utils as vis_util

Upvotes: 2
