C. Güzelhan

Reputation: 171

CPU/GPU Memory Usage with Tensorflow

I want to run a Python script that uses Tensorflow on a server. When I ran it with no session configuration, the process allocated all of the GPU memory, preventing any other process from accessing the GPU.

The server specs are the following:

This server is shared among other colleagues, so I am not really allowed to allocate all of the GPU memory.

On the Tensorflow website, I found these instructions for setting a threshold on the GPU memory used.

import tensorflow as tf

config = tf.ConfigProto()
# Limit this process to at most 40% of the GPU's memory.
config.gpu_options.per_process_gpu_memory_fraction = 0.4
session = tf.Session(config=config, ...)

I have two questions regarding this:

  1. If the allocated GPU memory is not enough, will the process automatically fall back to the CPU, or will it crash?
  2. What happens if a process wants to use the GPU but the GPU is already fully allocated?

Thank you.

Upvotes: 1

Views: 4145

Answers (2)

Riley Patterson

Reputation: 11

Tensorflow provides a few options as alternatives to its default behavior of allocating all available GPU memory (which it does to avoid memory fragmentation and to operate more efficiently); a short sketch combining them follows the link below. These options are:

  • config.gpu_options.allow_growth - when set to True, allocates GPU memory on demand as the process needs it, but never releases memory back
  • config.gpu_options.per_process_gpu_memory_fraction - when set to a double between 0 and 1, statically allocates only that fraction of the available memory instead of all of it

See https://www.tensorflow.org/tutorials/using_gpu#allowing_gpu_memory_growth for more detail.
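For reference, here is a minimal sketch (TF 1.x API) of how either option is passed to a session; the tiny constant graph at the end is just an illustrative placeholder, not part of the original question.

import tensorflow as tf

config = tf.ConfigProto()
# Option 1: start small and grow GPU allocations on demand
# (memory is never released back once allocated).
config.gpu_options.allow_growth = True
# Option 2: instead, cap this process at 40% of the GPU's memory up front.
# config.gpu_options.per_process_gpu_memory_fraction = 0.4

with tf.Session(config=config) as sess:
    a = tf.constant([1.0, 2.0])
    b = tf.constant([3.0, 4.0])
    print(sess.run(a + b))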

Upvotes: 0

gidim

Reputation: 2323

  1. If the allocated GPU memory is not enough, TF will throw an out-of-memory error and crash; it will not silently fall back to the CPU.

  2. TF will also crash in this case.

Upvotes: 2
