sequence_hard

Reputation: 5385

Share a list between different processes?

I have the following problem. I have written a function that takes a list as input and creates a dictionary for each element in the list. I then want to append each dictionary to a new list, so I get a list of dictionaries. I am trying to spawn multiple processes for this. My problem is that I want the different processes to see the list of dictionaries as it is updated by the other processes, for example so that something is printed once the list reaches a certain length.

My example would be like this:

import multiprocessing

my_list = ['A', 'B', 'C', 'D', 'E', 'F']

def do_stuff(element):
    element_dict = {}
    element_dict['name'] = element
    new_list = []
    new_list.append(element_dict)
    if len(new_list) > 3:
        print('list > 3')

### Main ###
pool = multiprocessing.Pool(processes=6)
pool.map(do_stuff, my_list)
pool.close()

Right now each process creates its own new_list. Is there a way to share the list between processes, so that all dictionaries are appended to the same list? Or is the only option to define new_list outside of the function?

Upvotes: 23

Views: 25610

Answers (3)

Velimir Mlaker

Reputation: 10985

One way is to use a manager object and create your shared list object from it:

from multiprocessing import Manager, Pool

input_list = ['A', 'B', 'C', 'D', 'E', 'F']

manager = Manager()
shared_list = manager.list()

def do_stuff(element):
    element_dict = {}
    element_dict['name'] = element
    shared_list.append(element_dict)
    if len(shared_list) > 3:
        print('list > 3')

pool = Pool(processes=6)
pool.map(do_stuff, input_list)
pool.close()

Remember that, unlike threads, processes do not share memory space. (Depending on the start method, a child process either gets a copy of the parent's memory at fork time or starts with a fresh interpreter; either way, later changes in one process are invisible to the others.) So processes can only communicate via some form of IPC (inter-process communication). In Python, one such mechanism is multiprocessing.Manager and the data structures it exposes, e.g. list or dict. These are used in code as easily as their built-in equivalents, but under the hood every operation on them is forwarded to a separate manager process, typically over sockets or pipes.
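
To make the difference concrete, here is a minimal sketch (hypothetical names, not part of the original answer): appending to a plain module-level list inside a worker only mutates that worker's private copy, while appending to a Manager proxy is forwarded back over IPC and is therefore visible everywhere.

from multiprocessing import Manager, Pool

plain_list = []

def append_plain(x):
    # Mutates this worker's private copy; the parent never sees it.
    plain_list.append(x)

def append_shared(args):
    x, shared = args
    # Proxied call, forwarded to the manager process over IPC.
    shared.append(x)

if __name__ == "__main__":
    with Pool(processes=4) as pool:
        pool.map(append_plain, range(4))
    print(plain_list)    # [] -- the parent's copy was never touched

    manager = Manager()
    shared = manager.list()
    with Pool(processes=4) as pool:
        pool.map(append_shared, [(x, shared) for x in range(4)])
    print(list(shared))  # [0, 1, 2, 3] (order may vary)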

Edit Feb 1, 2022: Removed unneeded global shared_list declaration from the function, since the object is not being replaced.

Upvotes: 34

hrdom

Reputation: 168

This version also runs on Windows 10, where multiprocessing uses the spawn start method: the manager and pool are created under the if __name__ == "__main__": guard, and the shared list is passed to each worker explicitly instead of being accessed as a global:

import multiprocessing

input_list = ['A', 'B', 'C', 'D', 'E', 'F']

def do_stuff(element, shared_list):
    element_dict = {}
    element_dict['name'] = element
    shared_list.append(element_dict)
    print(shared_list)

if __name__ == "__main__":
    manager = multiprocessing.Manager()
    shared_list = manager.list()
    pool = multiprocessing.Pool(processes=6)
    # Pair each element with the shared list so every worker receives the proxy.
    tasks = [(x, shared_list) for x in input_list]
    pool.starmap(do_stuff, tasks)
    pool.close()

Upvotes: 0

Takintayo Akinbiyi

Reputation: 19

The following is from the Python documentation for multiprocessing.shared_memory and demonstrates ShareableList, a fixed-size list of primitive values backed by a shared memory block:

>>> from multiprocessing import shared_memory
>>> a = shared_memory.ShareableList(['howdy', b'HoWdY', -273.154, 100, None, True, 42])
>>> [type(entry) for entry in a]
[<class 'str'>, <class 'bytes'>, <class 'float'>, <class 'int'>, <class 'NoneType'>, <class 'bool'>, <class 'int'>]
>>> a[2]
-273.154
>>> a[2] = -78.5
>>> a[2]
-78.5
>>> a[2] = 'dry ice'  # Changing data types is supported as well
>>> a[2]
'dry ice'
>>> a[2] = 'larger than previously allocated storage space'
Traceback (most recent call last):
  ...
ValueError: exceeds available storage for existing str
>>> a[2]
'dry ice'
>>> len(a)
7
>>> a.index(42)
6
>>> a.count(b'howdy')
0
>>> a.count(b'HoWdY')
1
>>> a.shm.close()
>>> a.shm.unlink()
>>> del a  # Use of a ShareableList after call to unlink() is unsupported
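
The docs example above runs in a single process. To actually share the list between processes, as the question asks, workers can attach to the same block by name. Here is a minimal sketch along those lines (hypothetical function and variable names; requires Python 3.8+):

from multiprocessing import Pool, shared_memory

def fill_slot(args):
    name, index, value = args
    # Attach to the existing shared block by name; no data is copied.
    attached = shared_memory.ShareableList(name=name)
    attached[index] = value
    attached.shm.close()  # detach in this worker; the block stays alive

if __name__ == "__main__":
    # A ShareableList has a fixed length and fixed per-slot storage, so
    # one-character placeholders reserve room for one-character results.
    sl = shared_memory.ShareableList(['?'] * 6)
    tasks = [(sl.shm.name, i, c) for i, c in enumerate('ABCDEF')]
    with Pool(processes=6) as pool:
        pool.map(fill_slot, tasks)
    print(list(sl))  # ['A', 'B', 'C', 'D', 'E', 'F']
    sl.shm.close()
    sl.shm.unlink()  # free the block once no process needs it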

Upvotes: 1
