Reputation: 1
I am trying to run a method over list elements with multiprocessing.
Each time, I need to pass a fixed group of arguments along with the next list item. I tried the code below, but I am not able to achieve working parallelism.
    fun(list_element, arg1, arg2)
    from multiprocessing import Process
    from multiprocessing import Pool

    def fun(each_item, arg1, arg2):
        print('start fun')
        print("fun in this function")
        print('end fun')

    def main():
        # Given list
        compute_HostList = ['abc.xyz.com', 'def.xyz.com', 'mno.xyz.com']
        print("Given Compute list: ", compute_HostList)
        # Wrap each element in its own list
        New_OS_Compute_List = [[x] for x in compute_HostList]
        print("The new lists of lists: ", New_OS_Compute_List)
        # Fixed extra arguments passed along with every list item
        arg1, arg2 = "OS DBAAS", "dbcs_patching"
        for each_item in New_OS_Compute_List:
            print("item_in_list:", each_item)
            p = Process(target=fun, args=(each_item, arg1, arg2))
            p.start()
            p.join()
        """
        # I also tried this using zip, but it is not working.
        for every in New_OS_Compute_List:
            tasks = [*zip(every, "OS DBAAS", "dbcs_patching")]
            with Pool(5) as pool:
                pool.starmap(decideTypeOfPatch, iterable=tasks)
        """

    if __name__ == '__main__':
        print('start main')
        main()
        print('end main')
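A note on the commented attempt: zip(every, "OS DBAAS", "dbcs_patching") pairs each hostname with single characters of those strings, not with the whole strings. A minimal sketch of one way to repeat the fixed arguments per task with itertools.repeat (here fun simply returns its arguments and stands in for decideTypeOfPatch):

```python
from itertools import repeat
from multiprocessing import Pool

def fun(each_item, arg1, arg2):
    # placeholder worker: just echoes back its arguments
    return (each_item, arg1, arg2)

def patch_all(hosts):
    # repeat() supplies the same extra arguments for every host;
    # zip stops when the shortest iterable (hosts) is exhausted
    tasks = list(zip(hosts, repeat("OS DBAAS"), repeat("dbcs_patching")))
    with Pool(5) as pool:
        return pool.starmap(fun, tasks)

if __name__ == '__main__':
    print(patch_all(['abc.xyz.com', 'def.xyz.com', 'mno.xyz.com']))
```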
Upvotes: 0
Views: 381
Reputation: 70582
When you do p.start() immediately followed by p.join(), it does what you told it to do ;-) That is, it runs the process, and then just sits there waiting for that process to finish. So you get no useful parallelism.
So instead of
    p = Process(target=fun, args=(each_item, arg1, arg2))
    p.start()
    p.join()
save the process objects in a list:
    plist = []
    ...
    p = Process(target=fun, args=(each_item, arg1, arg2))
    plist.append(p)
    p.start()
and after that loop is over wait for them to end:
    for p in plist:
        p.join()
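Putting that together, a minimal runnable sketch of the start-all-then-join-all pattern (hostnames and argument values taken from the question; the worker body is a placeholder):

```python
from multiprocessing import Process

def fun(each_item, arg1, arg2):
    # placeholder worker; real per-host work goes here
    print(each_item, arg1, arg2)

def run_all(hosts, arg1, arg2):
    plist = []
    for each_item in hosts:
        p = Process(target=fun, args=(each_item, arg1, arg2))
        plist.append(p)
        p.start()          # start every process first...
    for p in plist:
        p.join()           # ...then wait for all of them together
    return plist

if __name__ == '__main__':
    hosts = ['abc.xyz.com', 'def.xyz.com', 'mno.xyz.com']
    run_all(hosts, 'OS DBAAS', 'dbcs_patching')
```

Because all three processes are started before any join, they run concurrently instead of one at a time.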
Upvotes: 1