Reputation: 403
I use the multiprocessing package for a multi-layered application, and I want to share multiple dicts among a few processes.
I have already found the multiprocessing.Manager class, but it seems to provide only one dict (Manager.dict()) per application. To me it looks like a singleton object.
Can anybody help me?
Upvotes: 1
Views: 852
Reputation: 310237
As far as I can tell, a single Manager instance can manage multiple independent dicts. Here's a simple example:
from multiprocessing import Process, Manager
def f(x, d1, d2):
    if x == 1:
        d1['foo'] = 'bar'
    if x == 2:
        d2['bar'] = 'foo'
    print(x, d1, d2)
    return x * x
if __name__ == '__main__':
    manager = Manager()
    d1 = manager.dict()
    d2 = manager.dict()
    p1 = Process(target=f, args=(1, d1, d2))
    p2 = Process(target=f, args=(2, d1, d2))
    p3 = Process(target=f, args=(3, d1, d2))
    processes = [p1, p2, p3]
    for p in processes:
        p.start()
    for p in processes:
        p.join()
I get the following output:
3 {'foo': 'bar'} {}
1 {'foo': 'bar'} {}
2 {'foo': 'bar'} {'bar': 'foo'}
The output varies from run to run, depending on which process gets to the dict's lock first:
mgilson:$ python ~/sandbox/test.py
3 {'foo': 'bar'} {}
1 {'foo': 'bar'} {}
2 {'foo': 'bar'} {'bar': 'foo'}
mgilson:$ python ~/sandbox/test.py
1 {'foo': 'bar'} {'bar': 'foo'}
2 {'foo': 'bar'} {'bar': 'foo'}
3 {'foo': 'bar'} {'bar': 'foo'}
mgilson:$ python ~/sandbox/test.py
1 {'foo': 'bar'} {'bar': 'foo'}
3 {'foo': 'bar'} {'bar': 'foo'}
2 {'foo': 'bar'} {'bar': 'foo'}
mgilson:$ python ~/sandbox/test.py
1 {'foo': 'bar'} {}
2 {'foo': 'bar'} {'bar': 'foo'}
3 {'foo': 'bar'} {'bar': 'foo'}
but in each case it is clear that the dictionaries are distinct (they end up with different keys).
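If you want to verify that from the parent process as well, here's a minimal sketch along the same lines (assuming Python 3; dict(proxy) just takes a plain-dict snapshot of each proxy):
from multiprocessing import Process, Manager

def f(x, d1, d2):
    # same worker as above, minus the printing
    if x == 1:
        d1['foo'] = 'bar'
    if x == 2:
        d2['bar'] = 'foo'

if __name__ == '__main__':
    manager = Manager()
    d1 = manager.dict()
    d2 = manager.dict()
    processes = [Process(target=f, args=(i, d1, d2)) for i in (1, 2, 3)]
    for p in processes:
        p.start()
    for p in processes:
        p.join()
    # snapshot the proxies as plain dicts after all workers finish
    print(dict(d1))  # {'foo': 'bar'}
    print(dict(d2))  # {'bar': 'foo'}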
FWIW, I'm using OS X. There are some subtleties in how multiprocessing works on Windows vs. other *nix systems that may come into play here...
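The main difference is the process start method: Windows always spawns fresh interpreter processes, while fork is the traditional default on *nix (newer Pythons on macOS also spawn). Because the example passes the dict proxies explicitly as arguments rather than relying on module-level globals, it should behave the same either way. If you want to check that on a *nix box, a rough sketch (assuming Python 3.4+, where set_start_method is available) would be:
import multiprocessing as mp

def f(x, d1, d2):
    if x == 1:
        d1['foo'] = 'bar'
    if x == 2:
        d2['bar'] = 'foo'
    print(x, d1, d2)

if __name__ == '__main__':
    # force the Windows-style start method on any platform (Python 3.4+)
    mp.set_start_method('spawn')
    manager = mp.Manager()
    d1 = manager.dict()
    d2 = manager.dict()
    processes = [mp.Process(target=f, args=(i, d1, d2)) for i in (1, 2, 3)]
    for p in processes:
        p.start()
    for p in processes:
        p.join()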
Upvotes: 1