anubhavashok

Reputation: 532

Storing tensorflow models in memory

The program I'm writing involves switching between models during run-time.

I am currently using Saver to save/load models from the disk as specified here: https://www.tensorflow.org/api_docs/python/state_ops/saving_and_restoring_variables#Saver.

The models are fairly small and can be stored in memory, so I was wondering if anyone knows of a way to store and restore these models in-memory instead of saving them to disk.

I tried modifying the TensorFlow source to save the model to memory; however, gen_io_ops appears to be generated at compile time. Another possibility would be memory-mapped files. Does anyone know of an easier way?
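One workaround that avoids both Saver and the filesystem (a sketch, not from the original post; the variable name `w` and the single-weight "model" are placeholders for illustration): fetch every variable's value into a Python dict of NumPy arrays with `sess.run`, and restore by feeding those arrays back through per-variable `assign` ops. This uses the `tf.compat.v1` graph-mode API.

```python
import tensorflow.compat.v1 as tf

tf.disable_eager_execution()

graph = tf.Graph()
with graph.as_default():
    # A tiny stand-in model: one weight vector.
    w = tf.get_variable("w", initializer=tf.constant([1.0, 2.0]))
    variables = tf.global_variables()
    # One placeholder + assign op per variable, so restores are pure feeds.
    placeholders = {v.name: tf.placeholder(v.dtype, v.get_shape())
                    for v in variables}
    restore_ops = {v.name: tf.assign(v, placeholders[v.name])
                   for v in variables}
    init = tf.global_variables_initializer()

sess = tf.Session(graph=graph)
sess.run(init)

# "Save": fetch every variable into an in-memory dict of NumPy arrays.
snapshot = {v.name: sess.run(v) for v in variables}

# Clobber the weights, then "restore" from the snapshot -- no disk involved.
sess.run(restore_ops["w:0"], feed_dict={placeholders["w:0"]: [9.0, 9.0]})
sess.run(list(restore_ops.values()),
         feed_dict={placeholders[name]: snapshot[name] for name in snapshot})
print(sess.run(w))  # the original values again
```

Building the placeholder/assign pairs once at graph-construction time matters: calling `tf.assign` with a new constant on every restore would keep growing the graph.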

Upvotes: 19

Views: 2576

Answers (1)

Jacob Holloway

Reputation: 887

I would just have two different sessions, each with its own computation graph. Alternatively, you could duplicate the computation graph (two copies of the variables, operations, etc.) within the same session. Then you would call sess.run(comp1 if useCompOne else comp2), or however you'd like to set it up.
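The two-graph variant above can be sketched like this (a minimal illustration, not the answerer's code; the `build_model` helper and the scaling "models" are made up stand-ins for real networks):

```python
import tensorflow.compat.v1 as tf

tf.disable_eager_execution()

def build_model(scale):
    """Build a trivial graph; a real model's ops would go here."""
    graph = tf.Graph()
    with graph.as_default():
        x = tf.placeholder(tf.float32, shape=[None])
        y = x * scale  # stand-in for the model's computation
    return graph, x, y

# Each model lives in its own graph with its own session, so their
# variables and ops never collide, and both stay resident in memory.
graph1, x1, y1 = build_model(2.0)
graph2, x2, y2 = build_model(10.0)
sess1 = tf.Session(graph=graph1)
sess2 = tf.Session(graph=graph2)

def run(use_one, values):
    # Switch models at run time by picking the session/graph pair.
    sess, x, y = (sess1, x1, y1) if use_one else (sess2, x2, y2)
    return sess.run(y, feed_dict={x: values})

print(run(True, [1.0, 2.0]))   # scaled by 2
print(run(False, [1.0, 2.0]))  # scaled by 10
```

Since both sessions stay open, switching between the models is just a Python branch; nothing is serialized or reloaded.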

Upvotes: 1
