Reputation: 1
I'm trying to make multiple Process subclass objects run on all CPU cores in my Python program. The code works, but it only uses one core.
I suspect the child processes are running sequentially, but I can't find an easy way to test this or to avoid it.
My code looks something like this:
# child processes
import multiprocessing as mp
import time
import random

class child(mp.Process):
    def __init__(self, comm):
        mp.Process.__init__(self)
        self.comm = comm

    def run(self):
        self.score = self.doWork()
        self.comm.put([self.score])

    def doWork(self):
        k = 0
        for x in range(9999):
            for y in range(9999):
                k = k + 1
        return random.randint(1, 1000)
# main process
def runSubProcess():
    list = []
    queue = mp.Queue()
    for p in range(4):
        p = child(queue)
        p.start()
        p.join()
        list.append(p)
    stillRunning = True
    while stillRunning:
        stillRunning = False
        for p in list:
            if p.is_alive():
                stillRunning = True
        time.sleep(0.1)
    while not queue.empty():
        item = queue.get()
        print(item)

if __name__ == "__main__":
    runSubProcess()
I'm running Python 3.8 in a 4-core environment on 64-bit Windows 10.
Version string:
Python 3.8.0a1 (tags/v3.8.0a1:e75eeb00b5, Feb 3 2019, 19:46:54) [MSC v.1916 32 bit (Intel)] on win32
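One way to check whether the children overlap (a diagnostic sketch, a hypothetical variant of the class above rather than my actual code) is to have each child print its process ID and start/finish times:

    # Diagnostic sketch: prints PID and timestamps so overlapping (parallel)
    # work intervals can be told apart from sequential ones.
    import multiprocessing as mp
    import os
    import time
    import random

    class child(mp.Process):
        def __init__(self, comm):
            mp.Process.__init__(self)
            self.comm = comm

        def run(self):
            start = time.time()
            print(f"pid {os.getpid()} started at {start:.2f}")
            self.comm.put([self.doWork()])
            print(f"pid {os.getpid()} finished after {time.time() - start:.2f}s")

        def doWork(self):
            k = 0
            for x in range(9999):
                for y in range(9999):
                    k = k + 1
            return random.randint(1, 1000)

    if __name__ == "__main__":
        q = mp.Queue()
        for i in range(4):
            p = child(q)
            p.start()
            p.join()  # same structure as runSubProcess() above
        # If each "finished" line prints before the next "started" line,
        # the children are running one at a time, not in parallel.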
Upvotes: 0
Views: 622
Reputation: 59218
Because you immediately join() each child after starting it, the main program waits for that child to finish before the loop starts the next one.
Start all of them first, then join all of them in a second loop, like this:
def runSubProcess():
    list = []
    queue = mp.Queue()
    for p in range(4):
        p = child(queue)
        p.start()
        list.append(p)
    for p in list:
        p.join()
    while not queue.empty():
        item = queue.get()
        print(item)
Note that there's no need for sleeping, since join() will "sleep" by itself if needed, and it does so without using any CPU.
Upvotes: 4