Reputation: 1433
I tried to speed up a calculation using Pool from the multiprocessing package. While I did get a significant speedup, I'm missing more and more values as I increase the core/worker count.
I share my variables with all processes through the mp.Value class.
Where did I go wrong, and how can I fix it?
import itertools
import multiprocessing as mp
from multiprocessing import Pool

poss = [x+1 for x in range(20)]
all_rolls = itertools.product(poss, repeat=6)

win = mp.Value('i', 0)
draw = mp.Value('i', 0)
loose = mp.Value('i', 0)

def some_func(roll):
    if(comparison on rolls):
        win.value += 1
    elif(other comparison):
        draw.value += 1
    else:
        loose.value += 1

with Pool(8) as p:
    p.map(some_func, all_rolls)
On 16 cores I got 55,923,638 values instead of 64,000,000.
Upvotes: 2
Views: 101
Reputation: 2557
In addition to what @jfowkes answered, note that each Value already carries its own lock (lock=True is the default), so you can lock only the counter you are updating, which might make things a bit faster than funnelling all three counters through a single shared lock:
# lock=True is the default, so each Value already has its own lock
win = mp.Value('i', 0, lock=True)
draw = mp.Value('i', 0, lock=True)
loose = mp.Value('i', 0, lock=True)

def some_func(roll):
    if(comparison on rolls):
        with win.get_lock():   # only the counter being updated is locked
            win.value += 1
    elif(other comparison):
        with draw.get_lock():
            draw.value += 1
    else:
        with loose.get_lock():
            loose.value += 1
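For reference, a minimal self-contained sketch of this approach (the comparisons are made-up placeholders since the question does not show them, the roll space is shrunk so the demo finishes quickly, and the Values are handed to the workers through the pool's initializer so it also works with the spawn start method):

import itertools
import multiprocessing as mp

def init_counters(w, d, l):
    # make the shared counters visible inside each worker process
    global win, draw, loose
    win, draw, loose = w, d, l

def some_func(roll):
    total = sum(roll)              # placeholder condition, not the original comparison
    if total > 12:
        with win.get_lock():       # lock only the counter being updated
            win.value += 1
    elif total == 12:
        with draw.get_lock():
            draw.value += 1
    else:
        with loose.get_lock():
            loose.value += 1

if __name__ == '__main__':
    poss = [x + 1 for x in range(20)]
    all_rolls = itertools.product(poss, repeat=3)   # 8,000 rolls instead of 64,000,000

    win = mp.Value('i', 0)
    draw = mp.Value('i', 0)
    loose = mp.Value('i', 0)

    with mp.Pool(8, initializer=init_counters, initargs=(win, draw, loose)) as p:
        p.map(some_func, all_rolls, chunksize=100)

    print(win.value, draw.value, loose.value)
    print('total:', win.value + draw.value + loose.value)  # should print 8000

Run as-is, the three counters should add up to 8,000 regardless of the number of workers.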
Upvotes: 2
Reputation: 1565
You need to protect the modification of your values with a Lock (see this article).
from multiprocessing import Lock

lock = Lock()

def some_func(roll):
    with lock:   # a single lock guards all three counters
        if(comparison on rolls):
            win.value += 1
        elif(other comparison):
            draw.value += 1
        else:
            loose.value += 1
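One caveat (not from the original answer): a module-level lock like this is only visible to the pool workers when processes are started with fork. With the spawn start method (the default on Windows and recent macOS) you would pass the lock, and likewise the three Values, to the workers through the pool's initializer, roughly like this:

from multiprocessing import Lock, Pool

def init_worker(l):
    # make the shared lock available as a global inside each worker
    global lock
    lock = l

if __name__ == '__main__':
    lock = Lock()
    # some_func and all_rolls as defined in the question
    with Pool(8, initializer=init_worker, initargs=(lock,)) as p:
        p.map(some_func, all_rolls)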
Upvotes: 3