Omer Anisfeld

Reputation: 1302

Calling apply_async inside a function that was itself started with apply_async (Python)

I am trying to call a pool from inside a function that is itself run through apply_async. I got a serialization error when I tried to pass the pool of one function to another function, so I moved the second pool to be global, but it still does not work for me. What am I missing? My code:

from multiprocessing import Pool
b_pool = Pool(1)

def func_a(i):
    global b_pool
    print "a : {}".format(i)
    try:
        res = b_pool.apply_async(func_b, args=(i,))
    except Exception as e:
        print e

def func_b(i):
    print "b : {}".format(i)
    file = "/home/ubuntu/b_apply.txt"
    f = open(file, "a")
    f.write("b : {}".format(i))
    f.close()


if __name__ == '__main__':
    a_pool = Pool(1)
    for i in range(10):
        res = a_pool.apply_async(func_a, args=(i,))

    a_pool.close()
    a_pool.join()

    b_pool.close()
    b_pool.join()

In this code only a prints 0-9; b prints nothing, not even to the file. I am using Python 2.7.
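(Side note: apply_async does not raise worker-side errors on its own; they are stored on the returned AsyncResult and only re-raised when you call get() on it, which is probably why nothing shows up here. A minimal sketch, using a throwaway work function of my own, of how to make such an error surface:)

from multiprocessing import Pool

def work(i):
    return i * 2

if __name__ == '__main__':
    pool = Pool(1)
    res = pool.apply_async(work, args=(1,))
    # get() re-raises anything that went wrong while the task was being
    # pickled or while it ran in the worker; without it the failure is
    # swallowed and the task simply seems to never run.
    print res.get(timeout=5)
    pool.close()
    pool.join()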

Upvotes: 0

Views: 89

Answers (1)

Omer Anisfeld

Reputation: 1302

The queue was a good direction; it is just that a multiprocessing.Queue can't be passed like this, but a Manager.Queue is the correct way of doing it. My code that worked:

from multiprocessing import Pool, Manager

def func_a(i, q):
    print "a : {}".format(i)
    try:
        q.put(i)  # hand the value over to the b pool through the shared queue
    except Exception as e:
        print e


def func_b(i, q):
    i = q.get()  # read whatever func_a put on the queue
    print "b : {}".format(i)


if __name__ == '__main__':
    m = Manager()
    q = m.Queue()  # a manager queue is a picklable proxy, so it can be passed to pool workers
    a_pool = Pool(1)
    b_pool = Pool(1)

    for i in range(10):
        res = a_pool.apply_async(func_a, args=(i, q))
        res_2 = b_pool.apply_async(func_b, args=(i, q))

    a_pool.close()
    a_pool.join()

    b_pool.close()
    b_pool.join()

This answer, Sharing a result queue among several processes, was very helpful.
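As a hedged aside on why the Manager proxy matters (not part of the original code, just a standard-library check): apply_async has to pickle its arguments; a plain multiprocessing.Queue refuses to be pickled outside of process inheritance, while a Manager queue is a proxy that pickles fine:

import pickle
from multiprocessing import Queue, Manager

if __name__ == '__main__':
    plain_q = Queue()
    try:
        pickle.dumps(plain_q)  # the same step apply_async performs on every argument
    except RuntimeError as e:
        # "Queue objects should only be shared between processes through inheritance"
        print e

    managed_q = Manager().Queue()
    pickle.dumps(managed_q)  # the proxy pickles fine, so it can be handed to pool workers
    print "manager queue pickled ok"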

Upvotes: 1
