Reputation: 23
Python 3 Queue.put() blocks the main process when the queue size exceeds a particular value (1386)
I use 30 subprocesses and two queues to process int numbers: each subprocess gets a number from the first queue and puts it into the second queue. I can see all the subprocesses finish successfully, but the main process is blocked. The weird thing is, when the length is less than 1387, it works fine. Python version: 3.7.0.
#!/usr/bin/env python
from multiprocessing import Manager, Process, Lock, Queue

def work(q_in, o_out, process, lock):
    print("process ", process, "start")
    while 1:
        lock.acquire()
        if q_in.empty():
            lock.release()
            break
        d1 = q_in.get(timeout=1)
        o_out.put(d1 * 2)
        print("in process ", process, " queue 2 size", o_out.qsize())
        lock.release()
    print("process ", process, "done")
if __name__ == '__main__':
    length = 1386
    q_in = Queue(length)
    q_out = Queue(length)
    for i in range(length):
        q_in.put(i)
    lock = Lock()
    processes = list()
    for i in range(30):
        p = Process(target=work, args=(q_in, q_out, i, lock))
        processes.append(p)
        p.start()
    [p.join() for p in processes]
    print("main done")
With length up to 1386, I can see "main done", but with length = 1387 all subprocesses close, yet "main done" never shows and the main process stays in the running state.
Upvotes: 2
Views: 2185
Reputation: 11
You can use put_nowait(), which raises a queue.Full exception whenever maxsize is reached. You can get more details from here:
import queue

q = queue.Queue(1)
q.put_nowait("a")
try:
    q.put_nowait("b")
except queue.Full:
    print("Queue is full.")
source - https://www.kite.com/python/docs/queue.Queue.put_nowait
Upvotes: 1
Reputation: 25197
The problem is that nothing is consuming the data from q_out. The workers are able to complete their work because the queue is buffered on their side, but (some of) the processes remain alive, waiting to be able to flush their data to the underlying pipe. See https://bugs.python.org/issue29797 for more details.

The pipe seems to be able to hold 1386 items in its buffer in your case.
Upvotes: 1