Reputation: 1
I am trying to follow the guide below on exporting an object detection model (based on the TensorFlow Object Detection API) that was trained on a GPU so that it can be used on a TPU for inference:
https://github.com/tensorflow/models/blob/master/research/object_detection/g3doc/tpu_exporters.md
One of the requirements states:
"Users are assumed to have:
PIPELINE_CONFIG: A pipeline_pb2.TrainEvalPipelineConfig config file"
but I cannot find pipeline_pb2.TrainEvalPipelineConfig anywhere online or in any repository. How do I obtain this file?
What is "INPUT_PLACEHOLDER: Name of input placeholder in model's signature_def_map", and where can I find it?
What is "INPUT_TYPE: Type of input node, which can be one of 'image_tensor', 'encoded_image_string_tensor', or 'tf_example'"? Where can I find it?
Where can I find an example of performing inference on a TPU using an object detection model trained on a GPU?
Best regards, Chew Kok Wah
Upvotes: 0
Views: 439
Reputation: 11
The proto is defined at: https://github.com/tensorflow/models/blob/2dfd1e6388c06eb47945f2592d5dad7609172491/research/object_detection/protos/pipeline.proto#L14
You are supposed to use the same config you used in training.
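For reference, the pipeline.config file you used for training is a text-format TrainEvalPipelineConfig message. A minimal sketch of its shape (top-level field names are from pipeline.proto; the values and paths here are illustrative placeholders, not a working config):

```protobuf
# Text-format pipeline_pb2.TrainEvalPipelineConfig (placeholder values).
model {
  ssd {
    num_classes: 90
    # ... model-specific settings ...
  }
}
train_config {
  batch_size: 24
  fine_tune_checkpoint: "PATH_TO_BE_CONFIGURED/model.ckpt"
}
train_input_reader {
  tf_record_input_reader {
    input_path: "PATH_TO_BE_CONFIGURED/train.record"
  }
}
eval_config {
  num_examples: 8000
}
eval_input_reader {
  tf_record_input_reader {
    input_path: "PATH_TO_BE_CONFIGURED/eval.record"
  }
}
```

If you trained with the Object Detection API, this is exactly the file you passed as pipeline_config_path; you do not need to find it anywhere else.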
It is in the MetaGraphDef in the SavedModel you exported.
SavedModel: https://github.com/tensorflow/tensorflow/blob/master/tensorflow/core/protobuf/saved_model.proto
MetaGraphDef.signature_def: https://github.com/tensorflow/tensorflow/blob/0bf070c54d015ee86de8328d8bdb5582dc0f6d93/tensorflow/core/protobuf/meta_graph.proto#L89
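In practice you can read the signature_def_map without parsing protos by hand, using the saved_model_cli tool that ships with TensorFlow; the placeholder name appears under the "inputs" entries. A sketch, assuming your SavedModel lives in ./exported_model:

```shell
# List every signature def in the SavedModel, with input/output tensor names.
saved_model_cli show --dir ./exported_model --all

# Or inspect a single signature; each "inputs" entry names an input placeholder.
saved_model_cli show --dir ./exported_model \
    --tag_set serve --signature_def serving_default
```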
It depends on what type of data you want to feed. Defined at https://github.com/tensorflow/models/blob/2dfd1e6388c06eb47945f2592d5dad7609172491/research/object_detection/exporter.py#L199
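To illustrate the difference between the input types, here is a sketch (NumPy only) of what you would feed for each choice. The 300x300 image size and the byte string are arbitrary placeholders; feeding 'tf_example' additionally requires serializing a tf.train.Example, which is omitted here:

```python
import numpy as np

# INPUT_TYPE = "image_tensor": a uint8 batch of already-decoded images,
# shape [batch, height, width, 3].
image_tensor = np.zeros((1, 300, 300, 3), dtype=np.uint8)

# INPUT_TYPE = "encoded_image_string_tensor": a batch of raw encoded image
# bytes (e.g. the contents of a JPEG file), one bytes object per image.
fake_jpeg_bytes = b"\xff\xd8\xff..."  # placeholder, not a real JPEG
encoded_image_string_tensor = np.array([fake_jpeg_bytes], dtype=object)

print(image_tensor.shape, image_tensor.dtype)  # (1, 300, 300, 3) uint8
print(encoded_image_string_tensor.shape)       # (1,)
```

Pick "image_tensor" if you decode images yourself, "encoded_image_string_tensor" if you want to feed raw JPEG/PNG bytes, and "tf_example" if your data is already stored as serialized tf.Example records.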
You can find an example at https://github.com/tensorflow/models/blob/2dfd1e6388c06eb47945f2592d5dad7609172491/research/object_detection/tpu_exporters/export_saved_model_tpu_lib_test.py#L56. It basically exports a TPU SavedModel given a checkpoint and a pipeline config.
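Based on that test, the export boils down to a single call. A hedged sketch (the function and argument names are assumed from export_saved_model_tpu_lib_test.py; all paths are placeholders to substitute with your own artifacts):

```python
from object_detection.tpu_exporters import export_saved_model_tpu_lib

# Placeholders: substitute your own training artifacts.
pipeline_config_file = "/path/to/pipeline.config"  # same config used in training
ckpt_path = "/path/to/model.ckpt"                  # GPU-trained checkpoint
export_dir = "/path/to/tpu_saved_model"            # output SavedModel directory

# Assumed signature, following the linked test.
export_saved_model_tpu_lib.export(
    pipeline_config_file,
    ckpt_path,
    export_dir,
    input_placeholder_name="placeholder_tensor",
    input_type="image_tensor",
)
```

The exported SavedModel can then be loaded on a TPU host for inference, as the rest of that test file demonstrates.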
Upvotes: 1