Reputation: 2051
In my algorithm, I need to create a graph iteratively and use shared variables. The problem is that every time I build the graph and use shared variables, TensorFlow creates new variables and allocates memory for them. This makes the memory usage grow with each iteration; eventually it consumes all available memory and the process gets killed.
So the solution is either of these:
1- Remove the variables created in the previous iteration and release their memory, then create the new graph. But how?
2- After every iteration, delete the graph to release the memory and create a new one. But then I need to somehow store the variables' values so that I can use them when building the next graph (because the variables are shared). But how?
What should I do?
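For what it's worth, a common workaround for option 2 is to snapshot the variable values as NumPy arrays, reset the default graph, rebuild it, and restore the snapshot. This is only a minimal sketch under assumed details (the variable `w` and its update op are hypothetical stand-ins for the real model), written against the TF1-style graph API via `tensorflow.compat.v1`:

```python
import numpy as np
import tensorflow.compat.v1 as tf  # TF1-style graph mode

tf.disable_eager_execution()

saved = None  # NumPy snapshot of the shared variable's values

for step in range(3):
    tf.reset_default_graph()  # drop the old graph so its memory can be freed
    # Rebuild the (hypothetical) shared variable in the fresh graph.
    w = tf.get_variable("w", shape=[2], initializer=tf.zeros_initializer())
    update = w.assign_add([1.0, 1.0])
    with tf.Session() as sess:
        sess.run(tf.global_variables_initializer())
        if saved is not None:
            # Restore the values carried over from the previous iteration.
            sess.run(w.assign(saved))
        sess.run(update)
        saved = sess.run(w)  # snapshot before this graph is discarded

print(saved)  # each element was incremented once per iteration: [3. 3.]
```

The snapshot-and-restore step is what makes the variables behave as "shared" across otherwise independent graphs.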
Upvotes: 0
Views: 2683
Reputation: 5206
TensorFlow supports resetting a session. See `tf.Session.reset` in the session management documentation for how to use it.
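A minimal sketch of what that looks like, assuming an in-process server so that `tf.Session.reset` has a target whose resource containers (which hold the variable state) it can clear; the variable `v` is just a placeholder for your shared state:

```python
import tensorflow.compat.v1 as tf  # TF1-style graph mode

tf.disable_eager_execution()

# Create a local in-process server to act as the session target.
server = tf.train.Server.create_local_server()

with tf.Session(server.target) as sess:
    v = tf.get_variable("v", initializer=1.0)
    sess.run(tf.global_variables_initializer())
    value = sess.run(v)  # 1.0

# Release the variables held in the target's resource containers,
# freeing their memory without restarting the process.
tf.Session.reset(server.target)
```

After the reset, any variable state on that target is gone, so the next session must re-initialize (or restore) its variables.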
Upvotes: 1