Allocating a GPU memory fraction with eager execution

Basically, I'm running a reinforcement learning model in eager mode and I need to limit the amount of memory each process claims from the GPU. In the graph API, this could be achieved by modifying a tf.ConfigProto() object and creating a session with that config object.
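For reference, the graph-mode approach looks roughly like this (a sketch for TF 1.x; the 0.3 fraction is just an illustrative value):

```python
import tensorflow as tf

# Graph mode: cap per-process GPU memory via ConfigProto.
config = tf.ConfigProto()
config.gpu_options.per_process_gpu_memory_fraction = 0.3  # illustrative value
# Alternatively, grow allocation on demand instead of a fixed fraction:
# config.gpu_options.allow_growth = True

sess = tf.Session(config=config)  # session honors the memory limit
```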

However, in the eager API there is no session. My question, then, is: how can I manage GPU memory in this case?

Upvotes: 2

Views: 363

Answers (1)

ash

Reputation: 6751

tf.enable_eager_execution() accepts a config argument, whose value would be the same ConfigProto message.

So, you should be able to set the same per-process options using that.
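Concretely, something like this should work (a sketch for TF 1.x eager mode; the 0.3 fraction is an illustrative value):

```python
import tensorflow as tf

# Same ConfigProto message as in graph mode.
config = tf.ConfigProto()
config.gpu_options.per_process_gpu_memory_fraction = 0.3  # illustrative value
# Or: config.gpu_options.allow_growth = True

# Pass it to eager execution instead of tf.Session. Must be called
# once, before any TensorFlow ops run in the process.
tf.enable_eager_execution(config=config)
```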

Hope that helps.

Upvotes: 4
