Reputation: 778
I am using the multiprocessing library to experiment with Python multi-processing, but I have run into a problem. Here is test code 1:
import multiprocessing

def test(name):
    print 'processing.....'
    tmp = 0
    for i in xrange(1000000000):
        tmp += i
    print 'process done'

if __name__ == '__main__':
    pools = multiprocessing.Pool()
    for i in xrange(2):
        pools.apply_async(test)
    pools.close()
    pools.join()
The result is:
processing
processing
done
done
Code 2:
import multiprocessing

class Test:
    def test(name):
        print 'processing.....'
        tmp = 0
        for i in xrange(1000000000):
            tmp += i
        print 'process done'

if __name__ == '__main__':
    t = Test()
    pools = multiprocessing.Pool()
    for i in xrange(4):
        pools.apply_async(t.test)
    pools.close()
    pools.join()
This time there is no output at all; the pool never calls t.test! I can't understand what happened. Why is this?
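For reference, a minimal sketch of how to make the hidden failure visible (assuming Python 2, where a bound method such as t.test cannot be pickled and so never reaches a worker): keep the AsyncResult objects returned by apply_async and call get() on them, so the problem is raised in the parent process instead of being dropped silently.

import multiprocessing

class Test:
    def test(name):
        print 'processing.....'

if __name__ == '__main__':
    t = Test()
    pools = multiprocessing.Pool()
    # keep the AsyncResult objects instead of throwing them away
    results = [pools.apply_async(t.test) for i in xrange(4)]
    pools.close()
    for r in results:
        # get() surfaces the failure: depending on the Python 2 minor version
        # it re-raises the pickling error, or with a timeout it raises
        # multiprocessing.TimeoutError instead of silently doing nothing
        r.get(timeout=10)
    pools.join()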
Upvotes: 0
Views: 275
Reputation: 4570
Instead of using a pool, you can simply collect the jobs in a list:
import multiprocessing

class Test(multiprocessing.Process):
    def run(self):
        print 'processing.....'
        tmp = 0
        for i in xrange(10):
            tmp += i
        print 'process done'
        return 1

if __name__ == '__main__':
    jobs = []
    for i in range(5):
        t = Test()
        jobs.append(t)
        t.start()
The jobs list lets you check whether each process has finished, ultimately giving you the same effect as using a pool.
If you want to make sure that all jobs are done:
if __name__ == '__main__':
    jobs = []
    for i in range(5):
        t = Test()
        jobs.append(t)
        t.start()

    not_done = any(job.is_alive() for job in jobs)
    while not_done:
        not_done = any(job.is_alive() for job in jobs)
    print 'job all done'
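As a design note, a sketch of an alternative that avoids the polling loop: each Process object has a join() method that blocks until that process has exited, so you can simply join every job in turn.

if __name__ == '__main__':
    jobs = []
    for i in range(5):
        t = Test()
        jobs.append(t)
        t.start()

    # join() blocks until each process has finished, so no polling is needed
    for job in jobs:
        job.join()
    print 'job all done'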
Upvotes: 1