sjpark

Reputation: 91

Python multiprocessing with Queue

I want to use multiprocessing in Python. My code works like BFS: it stores tasks in a queue, pulls them out one by one, and executes them. If a new task is created during execution, it is stored in the queue, and this repeats until the queue is empty. I want to parallelize this with multiprocessing.

Here is my code,

import multiprocessing as mp
import time

def task(queue):
  results = do_task(queue.get())
  for r in results:
    queue.put(r)

pool = mp.Pool(3)
queue = mp.Manager().Queue()

init_queue(queue) # queue.put(...)
while queue.qsize() > 0:
  pool.apply_async(task, (queue,)) # args must be a tuple
  time.sleep(0.1)

When I run this code, the while loop exits before the tasks are done, so I need the time.sleep(...) call. But sleeping is inefficient, and there is no guarantee that a task will always finish within the sleep interval.

So, is there a way to check whether any process in the pool is still working?

like :

while queue.qsize() > 0:
  pool.apply_async(task, (queue,))
  time.sleep(0.1)

to

while queue.qsize() > 0 and check_func():
  pool.apply_async(task, (queue,))

Thanks!

Upvotes: 0

Views: 366

Answers (1)

rdas

Reputation: 21295

The apply_async method returns an AsyncResult object that you can use to poll for the result.

Here's an example from the docs:

# evaluate "f(20)" asynchronously
res = pool.apply_async(f, (20,))      # start the task
print(res.get(timeout=1))             # wait at most 1 sec for the task to finish

You can also use the ready() method of that object to check whether the call has completed, without fetching the return value:

res.ready() # True if the call completed
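Applied to the question's BFS-style loop, one way to use this is to keep a list of in-flight AsyncResult objects and keep looping while either the queue still has items or any call is not yet ready(). This is only a sketch: do_task here is a hypothetical stand-in that spawns two children per item up to a small depth, and the worker returns its results so the main loop can requeue them (rather than the worker putting them on the queue itself).

```python
import multiprocessing as mp
import time

def do_task(item):
    # Hypothetical stand-in for the real work: items below 2 each
    # spawn two child tasks, deeper items spawn none.
    return [item + 1, item + 1] if item < 2 else []

def run_bfs(num_workers=3):
    """Drain the task queue, requeueing children, until both the
    queue and the list of in-flight calls are empty."""
    with mp.Pool(num_workers) as pool:
        manager = mp.Manager()
        queue = manager.Queue()
        queue.put(0)                  # init_queue(...) equivalent
        pending = []                  # AsyncResults still in flight
        processed = 0
        while queue.qsize() > 0 or pending:
            # Harvest finished calls; requeue the tasks they produced.
            still_running = []
            for res in pending:
                if res.ready():
                    for child in res.get():
                        queue.put(child)
                    processed += 1
                else:
                    still_running.append(res)
            pending = still_running
            # Submit everything currently waiting in the queue.
            while queue.qsize() > 0:
                pending.append(pool.apply_async(do_task, (queue.get(),)))
            time.sleep(0.01)          # avoid a hot spin
        return processed

if __name__ == "__main__":
    print(run_bfs())  # 1 root + 2 children + 4 grandchildren = 7
```

Because only the main process calls queue.get() and queue.put(), there is no risk of a worker blocking forever on an empty queue, and the loop terminates exactly when no work is queued and no call is outstanding.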

Upvotes: 1
