Nitro

Reputation: 1293

Python multiprocessing queue get and put

I am trying to use Python multiprocessing to fill a queue with values and then print them back out, but I am having trouble. Could someone point out what I am doing wrong?

import multiprocessing

my_q = multiprocessing.Queue()
my_list = [i for i in range(0, 100)]

def enqueue(q):
    for data in my_list:
        q.put(data)

def get_it(q):
    while not q.empty():
        item = q.get()
        print(item)


p1 = multiprocessing.Process(target=enqueue, args=(my_q,))
p2 = multiprocessing.Process(target=get_it, args=(my_q,))
p1.start()
p2.start()

p1.join()
p2.join()

This program executes without printing anything.

Upvotes: 4

Views: 6071

Answers (1)

falsetru

Reputation: 369134

If get_it runs before the queue has been populated, q.empty() returns True straight away, so the loop exits immediately and nothing is printed.

You need to make sure the queue is populated before get_it is called.

For example, wait until enqueue has finished, so that every item is already on the queue:

...

p1 = multiprocessing.Process(target=enqueue, args=(my_q,))
p1.start()
p1.join()

p2 = multiprocessing.Process(target=get_it, args=(my_q,))
p2.start()
p2.join()

Or, to let the producer and consumer run concurrently, modify get_it as below so it does not stop until a sentinel value arrives:

...

def get_it(q):
    while True:
        item = q.get()
        if item is None:  # stop when the sentinel value (None) appears
            break
        print(item)


my_list.append(None)  # sentinel value to mark the end of the input
p1 = multiprocessing.Process(target=enqueue, args=(my_q,))
p2 = multiprocessing.Process(target=get_it, args=(my_q,))
p1.start()
p2.start()
p1.join()
p2.join()
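
A follow-up note on the sentinel pattern (a sketch that goes beyond the answer above): if you ever run more than one consumer, put one sentinel on the queue per consumer so that each of them stops cleanly. NUM_CONSUMERS below is purely an illustrative assumption:

import multiprocessing

NUM_CONSUMERS = 2  # assumption: two consumer processes, for illustration only

def get_it(q):
    while True:
        item = q.get()
        if item is None:  # each consumer stops on its own sentinel
            break
        print(item)

if __name__ == '__main__':
    my_q = multiprocessing.Queue()
    consumers = [multiprocessing.Process(target=get_it, args=(my_q,))
                 for _ in range(NUM_CONSUMERS)]
    for c in consumers:
        c.start()
    for i in range(100):       # produce the items from the main process
        my_q.put(i)
    for _ in range(NUM_CONSUMERS):
        my_q.put(None)         # one sentinel per consumer
    for c in consumers:
        c.join()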

Or use multiprocessing.pool.Pool.map instead:

import multiprocessing.pool

def get_it(item):
    print(item)

pool = multiprocessing.pool.Pool()  # defaults to one worker process per CPU core
pool.map(get_it, range(100))        # blocks until every item has been processed
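
As a usage note (a sketch, not part of the original answer): wrapping the pool setup in an if __name__ == '__main__': guard and closing and joining the pool explicitly keeps the example portable to platforms that spawn worker processes, such as Windows:

import multiprocessing

def get_it(item):
    print(item)

if __name__ == '__main__':          # required when worker processes are spawned (e.g. on Windows)
    pool = multiprocessing.Pool()   # defaults to one worker per CPU core
    pool.map(get_it, range(100))    # blocks until every item has been processed
    pool.close()                    # no further tasks will be submitted
    pool.join()                     # wait for the workers to finish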

Upvotes: 2
