Xudong

Reputation: 525

Multiprocessing in Python won't release memory

I am running some multiprocessing code. The structure of the code is roughly as follows:

import multiprocessing

import numpy as np

def func_a(x):
    # main function here
    return result

def func_b(y):
    cores = multiprocessing.cpu_count() - 1
    pool = multiprocessing.Pool(processes=cores)
    results = pool.map(func_a, np.arange(1000))
    return results

if __name__ == '__main__':
    final_resu = []
    for i in range(0, 200):
        final_resu.append(func_b(i))

This code has two problems. First, memory usage keeps growing as the loop runs. Second, in Task Manager (Windows 10), the number of Python processes increases step-wise (14, then 25, then 36, then 47, ...) every time an iteration of the main loop finishes.

I believe something is wrong with the multiprocessing, but I'm not sure how to deal with it. It looks like the pool created in func_b is not cleaned up when the main loop finishes an iteration?

Upvotes: 0

Views: 88

Answers (1)

Tim Peters

Reputation: 70582

As the examples in the docs show, when you're done with a Pool you should shut it down explicitly, via pool.close() followed by pool.join(). That said, it would be better still if, in addition, you created your Pool only once: for example, pass a Pool as an argument to func_b(), and create it (and close it down) only once, in the __name__ == '__main__' block.
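For illustration, here is a sketch of that restructuring. Your real func_a body is elided in the question, so a trivial placeholder is used here to make the example runnable. The pool is created once, reused by every call to func_b(), and shut down once at the end:

import multiprocessing

import numpy as np

def func_a(x):
    return x * x  # placeholder for the real work

def func_b(pool, y):
    # reuse the pool passed in instead of creating a new one per call
    return pool.map(func_a, np.arange(1000))

if __name__ == '__main__':
    cores = multiprocessing.cpu_count() - 1
    pool = multiprocessing.Pool(processes=cores)
    final_resu = []
    for i in range(0, 200):
        final_resu.append(func_b(pool, i))
    pool.close()  # no more tasks will be submitted to the pool
    pool.join()   # wait for the worker processes to terminate

This way only one set of worker processes ever exists, so the process count in Task Manager stays flat, and the workers exit (and release their memory) when close() and join() run.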

Upvotes: 2
