Reputation: 1
Below is my demo:
Thread xxx:
import xxx
import xxxx
from multiprocessing import Process

def test_process1(hostname, proxy_param):
    # just never runs
    try:  # breakpoint 0
        with open("/xxx", "w+") as f:  # breakpoint 1
            f.write("something")
    except Exception as e:
        pass  # just never runs, breakpoint 3

def test():
    try:
        a = Process(target=test_process1, args=(hostname, proxy_param))
        a.start()
        a.join()  # blocks here; test_process1 is not running and never exits
    except Exception as e:
        pass  # breakpoint 4
The function test_process1 simply never runs: no error, and no breakpoint is hit.
The test function is part of a large project; the above is just a demo.
Upvotes: 0
Views: 216
Reputation: 1219
Hope this piece of code helps.
The workers list will get divided among the processes in use.
Sample code with a managed list:
from multiprocessing import Process, Manager, Pipe

def child_process(child_conn, output_list, messenger):
    input_recvd = messenger["input"]
    output_list.append(input_recvd)
    print(input_recvd)
    child_conn.close()

def parent_process(number_of_process=2):
    workers_inputs = [{"input": "hello"}, {"input": "world"}]
    with Manager() as manager:
        processes = []
        output_list = manager.list()  # <-- can be shared between processes
        parent_conn, child_conn = Pipe()
        for single_id_dict in workers_inputs:
            pro_obj = Process(target=child_process,
                              args=(child_conn, output_list, single_id_dict))  # passing the managed list
            pro_obj.start()
            processes.append(pro_obj)
        for p in processes:
            p.join()
        output_list = [single_feature for single_feature in output_list]
    return output_list

if __name__ == "__main__":  # guard required on spawn-based platforms
    print(parent_process())
OUTPUT:
hello
world
['hello', 'world']
A managed list is useful for collecting output from several parallel processes; it works like a built-in queue mechanism that is easy to use and safe from deadlocks.
Upvotes: 1