Pusheen_the_dev

Reputation: 2197

Can a model trained on a GPU be used for inference on a CPU, and vice versa?

I was wondering if a model trained on the GPU could be used to run inference on the CPU (and vice versa)? Thanks!

Upvotes: 18

Views: 7901

Answers (1)

Yaroslav Bulatov

Reputation: 57893

You can, as long as your model doesn't have explicit device allocations. That is, if your model has blocks like `with tf.device('gpu:0')`, it will complain when you run it on a machine without a GPU.

In such cases you must make sure your imported model doesn't have explicit device assignments, for instance by using the `clear_devices` argument of `import_meta_graph`.
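A minimal sketch of this (using the TF1-style graph API via `tf.compat.v1`; the checkpoint path is made up for illustration): save a graph whose ops are pinned to a device, then re-import its MetaGraph with `clear_devices=True` so the device fields are stripped and the checkpoint can be restored on hardware that lacks the original device.

```python
import os
import tempfile
import tensorflow.compat.v1 as tf

tf.disable_eager_execution()

ckpt = os.path.join(tempfile.mkdtemp(), "model")  # hypothetical checkpoint path

# Build and save a graph with an explicit device assignment
# (here '/cpu:0' stands in for a 'gpu:0' pin from a GPU machine).
with tf.Graph().as_default():
    with tf.device("/cpu:0"):
        v = tf.get_variable("v", initializer=tf.constant(1.0))
    saver = tf.train.Saver()
    with tf.Session() as sess:
        sess.run(tf.global_variables_initializer())
        saver.save(sess, ckpt)

# Re-import with clear_devices=True: device fields in the imported
# graph are cleared, so the model can run on a different machine.
with tf.Graph().as_default() as g:
    restorer = tf.train.import_meta_graph(ckpt + ".meta", clear_devices=True)
    with tf.Session() as sess:
        restorer.restore(sess, ckpt)
        devices = {op.device for op in g.get_operations()}
```

After the import, every op's `device` attribute is empty, and TensorFlow's placer is free to assign the ops to whatever devices are actually available.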

Upvotes: 18
