Joshua Piccari

Reputation: 325

How to load tensorflow graph from memory address

I'm using the TensorFlow C++ API to load a graph from a file and execute it. Everything is working great, but I'd like to load the graph from memory rather than from a file (so that I can embed the graph into the binary for better portability). I have variables that reference both the binary data (as an unsigned char array) and the size of the data.

This is how I am currently loading my graph:

GraphDef graph_def;
ReadBinaryProto(tensorflow::Env::Default(), "./graph.pb", &graph_def);

It feels like this should be simple, but most of the discussion I've found is about the Python API. I did try looking for the source of ReadBinaryProto but wasn't able to find it in the TensorFlow repo.

Upvotes: 2

Views: 1452

Answers (1)

ash

Reputation: 6751

The following should work:

GraphDef graph_def;
if (!graph_def.ParseFromArray(data, len)) {
    // Handle error
}
...

This is because GraphDef is a subclass of google::protobuf::MessageLite, and thus inherits a variety of parsing methods.
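For the embedding use case in the question, a common pattern (a sketch, not a verified recipe for any particular TensorFlow version) is to generate a header from the `.pb` file with `xxd -i graph.pb > graph_data.h`, which defines `graph_pb` and `graph_pb_len`, and then parse the graph straight from those symbols before handing it to a session:

```cpp
#include <memory>

#include "tensorflow/core/framework/graph.pb.h"
#include "tensorflow/core/public/session.h"
#include "graph_data.h"  // hypothetical header from `xxd -i graph.pb`;
                         // defines `unsigned char graph_pb[]` and
                         // `unsigned int graph_pb_len`

tensorflow::Status LoadEmbeddedGraph(
    std::unique_ptr<tensorflow::Session>* session) {
  tensorflow::GraphDef graph_def;
  // ParseFromArray is inherited from the protobuf base class.
  if (!graph_def.ParseFromArray(graph_pb, graph_pb_len)) {
    return tensorflow::errors::InvalidArgument(
        "Could not parse embedded GraphDef");
  }
  session->reset(tensorflow::NewSession(tensorflow::SessionOptions()));
  return (*session)->Create(graph_def);
}
```

The `graph_data.h` header and the symbol names are assumptions based on how `xxd -i` names its output; adjust them to match your build setup.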

Edit: Caveat: As of January 2017, the snippet above works only when the serialized graph is smaller than 64MB, because of a default protocol buffer limit. For larger graphs, take inspiration from ReadBinaryProto's implementation.
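For graphs over that limit, one possible approach (a sketch modeled on what ReadBinaryProto does; `CodedInputStream` comes from the protobuf library, and the exact signature of `SetTotalBytesLimit` has varied across protobuf versions) is to raise the limit explicitly before parsing:

```cpp
#include <climits>
#include <cstddef>

#include "google/protobuf/io/coded_stream.h"
#include "tensorflow/core/framework/graph.pb.h"

bool ParseLargeGraphDef(const unsigned char* data, size_t len,
                        tensorflow::GraphDef* graph_def) {
  google::protobuf::io::CodedInputStream coded_stream(
      data, static_cast<int>(len));
  // The default total-bytes limit is 64MB; raise it for large graphs.
  // (Older protobuf versions also take a warning threshold as a second
  // argument: SetTotalBytesLimit(INT_MAX, -1).)
  coded_stream.SetTotalBytesLimit(INT_MAX);
  return graph_def->ParseFromCodedStream(&coded_stream) &&
         coded_stream.ConsumedEntireMessage();
}
```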

FWIW, the code for ReadBinaryProto is in tensorflow/core/platform/env.cc

Upvotes: 3
