Reputation: 85
In general, using Pool and starmap, if we have

if __name__ == '__main__':
    with multiprocessing.Pool() as p:
        temp_arr = p.starmap(process, tuple_list)

then tuple_list = [(1, 2), (3, 4)], for example, results in the calls process(1, 2) and process(3, 4), each assigned to a different worker process.
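(For reference, a minimal self-contained sketch of that baseline pattern; the process function here is a hypothetical placeholder, not code from the question:)

import multiprocessing

def process(a, b):
    # hypothetical worker: just combine the two arguments
    return a + b

if __name__ == '__main__':
    tuple_list = [(1, 2), (3, 4)]
    with multiprocessing.Pool() as p:
        temp_arr = p.starmap(process, tuple_list)
    print(temp_arr)  # [3, 7]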
If I have:
dict = {'0': [(1,1), (2,3)], '1': [(4,4)], '2': [(2,4), (3,5)]}

is there a way I can use Pool so that all values of key '0' go to the first worker in one call (as the tuple list [(1,1), (2,3)], so that I can still process each tuple separately inside process() later on), the values of key '1' go to the second worker, and so on?
Thanks in advance.
Upvotes: 0
Views: 206
Reputation: 143197
You can use map() with dict.values():
import multiprocessing as mp

dict = {
    '0': [(1,1), (2,3)],
    '1': [(4,4)],
    '2': [(2,4), (3,5)]
}

def process(data):
    # data is the full list of tuples stored under one key
    print(f"process data: {data}")
    #return result

if __name__ == '__main__':
    with mp.Pool() as p:
        all_results = p.map(process, dict.values())
Result:
process data: [(1, 1), (2, 3)]
process data: [(4, 4)]
process data: [(2, 4), (3, 5)]
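If the worker also needs to know which key it is handling, a variation (my sketch, not part of the original answer) is to send (key, values) pairs from dict.items() through starmap(), which unpacks each pair into two arguments:

import multiprocessing as mp

dict = {'0': [(1,1), (2,3)], '1': [(4,4)], '2': [(2,4), (3,5)]}

def process(key, data):
    # each call receives one key together with its full list of tuples
    print(f"key: {key}, data: {data}")

if __name__ == '__main__':
    with mp.Pool() as p:
        p.starmap(process, dict.items())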
Upvotes: 1
Reputation: 1708
Try this:
import multiprocessing as mp
import time

dict = {'0': [(1,1), (2,3)], '1': [(4,4)], '2': [(2,4), (3,5)]}

def process(tup):
    print(f"input tuple: {tup} -- worker_id: {mp.current_process()}\n")
    time.sleep(2)

def process_all(index):
    # handle every tuple stored under one key in the same worker
    for tup in dict[index]:
        process(tup)

if __name__ == '__main__':
    with mp.Pool() as p:
        # map, not starmap: each key is passed as one argument
        # (starmap would unpack a multi-character key into separate characters)
        temp_arr = p.map(process_all, dict.keys())
# Result
#input tuple: (1, 1) -- worker_id: <ForkProcess(ForkPoolWorker-121, started daemon)>
#input tuple: (2, 4) -- worker_id: <ForkProcess(ForkPoolWorker-123, started daemon)>
#input tuple: (4, 4) -- worker_id: <ForkProcess(ForkPoolWorker-122, started daemon)>
#input tuple: (3, 5) -- worker_id: <ForkProcess(ForkPoolWorker-123, started daemon)>
#input tuple: (2, 3) -- worker_id: <ForkProcess(ForkPoolWorker-121, started daemon)>
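One caveat (my addition, not from this answer): with many keys, Pool.map() may batch several keys into a single task for the same worker; passing chunksize=1 guarantees that each key is dispatched as its own task:

temp_arr = p.map(process_all, dict.keys(), chunksize=1)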
Is this what you wanted?
Upvotes: 1