Reputation: 2197
I was wondering whether a model trained on the GPU can be used to run inference on the CPU (and vice versa)? Thank you!
Upvotes: 18
Views: 7901
Reputation: 57893
You can, as long as your model doesn't have explicit device assignments. I.e., if your graph contains blocks like with tf.device('/gpu:0'), it will complain when you run it on a machine without a GPU.
In such cases you must make sure your imported model has no explicit device assignments, for instance by using the clear_devices argument of import_meta_graph.
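A minimal sketch of restoring a GPU-trained checkpoint on a CPU-only machine with TF 1.x-style APIs (the checkpoint prefix model.ckpt is a hypothetical name; substitute your own saved files):

```python
import tensorflow.compat.v1 as tf

tf.disable_eager_execution()

# Hypothetical checkpoint prefix; adjust to your own saved model,
# which consists of model.ckpt.meta, model.ckpt.index, etc.
checkpoint_prefix = 'model.ckpt'

with tf.Session() as sess:
    # clear_devices=True strips the explicit device assignments
    # (e.g. /gpu:0) baked into the saved graph definition, so the
    # restored model can run on whatever devices are available.
    saver = tf.train.import_meta_graph(checkpoint_prefix + '.meta',
                                       clear_devices=True)
    saver.restore(sess, checkpoint_prefix)
    # ...run inference with sess.run(...) as usual...
```

Without clear_devices=True, restoring a graph saved with hard-coded GPU placements fails on a CPU-only machine (unless you also set allow_soft_placement=True in the session config, which lets TensorFlow fall back to available devices).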
Upvotes: 18