Reputation: 356
I'm writing a Python program that will load a wordlist from a text file and then try unzipping an archive with each word. It wouldn't be a serious attempt if it didn't make use of all CPU cores, and because of the GIL, threading in Python isn't a great option for CPU-bound work like this, if I'm not mistaken.
So I want to get the number of CPU cores, split the wordlist, and use multiprocessing.Process to work on different parts of the wordlist in different processes, roughly as in the sketch below.
But would every process get pinned to a CPU core automatically? If not, is there a way to pin them manually?
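For reference, this is roughly what I have in mind (archive.zip, wordlist.txt and the try_passwords worker are just placeholder names, untested):

    import multiprocessing as mp
    import zipfile

    def try_passwords(words, archive_path="archive.zip"):
        # Each process works through its own slice of the wordlist.
        with zipfile.ZipFile(archive_path) as zf:
            for word in words:
                try:
                    zf.extractall(pwd=word.encode())
                    print("Password found:", word)
                    return
                except Exception:
                    # A wrong password raises an exception; try the next word.
                    continue

    if __name__ == "__main__":
        with open("wordlist.txt") as f:
            words = [line.strip() for line in f]

        n = mp.cpu_count()
        chunks = [words[i::n] for i in range(n)]  # split the wordlist into n parts

        procs = [mp.Process(target=try_passwords, args=(chunk,)) for chunk in chunks]
        for p in procs:
            p.start()
        for p in procs:
            p.join()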
Upvotes: 2
Views: 9871
Reputation: 1220
You can use Python's multiprocessing module by importing

    import multiprocessing as mp

and find out the number of processors with mp.cpu_count(), which should work on most platforms.
To launch programs/processes on specific CPU cores (on Linux) you can use taskset; use this guide as a reference.
An alternative cross-platform solution would be to use the psutil package for Python, which provides cpu_affinity() as a programmatic counterpart to taskset on the platforms that support it, as sketched below.
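A minimal sketch of pinning each worker to its own core with psutil might look like this (note that cpu_affinity() is only available on Linux, Windows and FreeBSD, and the worker body here is just a placeholder):

    import multiprocessing as mp
    import os
    import psutil

    def worker(core_id):
        # Restrict the current process to a single core.
        psutil.Process(os.getpid()).cpu_affinity([core_id])
        print("Worker", os.getpid(), "pinned to core", core_id)
        # ... do the actual work for this process here ...

    if __name__ == "__main__":
        procs = [mp.Process(target=worker, args=(i,)) for i in range(mp.cpu_count())]
        for p in procs:
            p.start()
        for p in procs:
            p.join()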
However, I would suggest you go with a thread/process pooling approach, as in my opinion you should let the operating system assign tasks to each CPU/core. You can look at How to utilize all cores with python multiprocessing for how to approach this problem.
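For example, a pool-based version could look roughly like this (check_word, archive.zip and wordlist.txt are placeholders; the pool, not your code, decides which core each worker runs on):

    import multiprocessing as mp
    import zipfile

    ARCHIVE = "archive.zip"  # placeholder name

    def check_word(word):
        # Return the word if it opens the archive, otherwise None.
        try:
            with zipfile.ZipFile(ARCHIVE) as zf:
                zf.extractall(pwd=word.encode())
            return word
        except Exception:
            return None

    if __name__ == "__main__":
        with open("wordlist.txt") as f:
            words = [line.strip() for line in f]

        # The pool spreads the words over cpu_count() worker processes;
        # the OS scheduler handles which core each worker runs on.
        with mp.Pool(mp.cpu_count()) as pool:
            for result in pool.imap_unordered(check_word, words, chunksize=100):
                if result is not None:
                    print("Password found:", result)
                    pool.terminate()
                    break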
Upvotes: 6