Ali Mohamad

Reputation: 45

How can I use only some of the CPU cores (a fixed number out of all cores) with TensorFlow settings?

I'm trying to use a university server for my deep learning code. The server has 64 CPU cores, but I should use only 24 of them so that everybody else can use the server too. I have tried to limit my CPU usage and searched all over Stack Overflow for a solution, but none of the suggestions work for me, for example downgrading TensorFlow and using

# TensorFlow 1.x API (requires the downgrade mentioned above)
config = tf.ConfigProto(allow_soft_placement=True,
                        intra_op_parallelism_threads=ncpu,
                        inter_op_parallelism_threads=ncpu)

and some other solutions using

import tensorflow as tf

tf.config.threading.set_intra_op_parallelism_threads(numb)
tf.config.threading.set_inter_op_parallelism_threads(numb)

I have to use TensorFlow version 2 or higher because my code uses the 'kerastuner' package.

Upvotes: 2

Views: 488

Answers (1)

MichaelJanz

Reputation: 1815

If you have admin rights on the server and it is running a version of Windows, you can simply restrict the resources via the Task Manager (for example by setting the process's CPU affinity).

If you want to do it in code: it looks like this is a bug in TensorFlow, which might be fixed, according to the GitHub issue.

You might want to try setting this environment variable in the shell, before starting Python:

export OMP_NUM_THREADS=2

and then, in Python, limiting the thread pools before TensorFlow runs any operations:

tf.config.threading.set_intra_op_parallelism_threads(2)
tf.config.threading.set_inter_op_parallelism_threads(1)

This was reported to work by Leslie-Fang. If it does not work for you, I guess your only option is to join the GitHub discussion until the bug is fixed.
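As an aside (not part of the reported fix), there is also a way to cap core usage that does not depend on TensorFlow's thread settings at all: on Linux you can pin the process to a fixed set of cores with `os.sched_setaffinity` from the Python standard library, before importing TensorFlow. The kernel then schedules every thread the process spawns on only those cores. A minimal sketch, assuming a Linux server and Python 3.3+:

```python
import os

# Restrict this process (pid 0 = the current process) to at most
# 24 cores; intersecting with the currently allowed set keeps this
# working on machines that have fewer than 24 cores.
allowed = set(range(24)) & os.sched_getaffinity(0)
os.sched_setaffinity(0, allowed)

# Any threads TensorFlow spawns later inherit this affinity mask.
print("usable cores:", len(os.sched_getaffinity(0)))
```

The same effect can be achieved from the shell with `taskset -c 0-23 python train.py` (where `train.py` stands in for your own script).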

Upvotes: 2
