Reputation: 7333
I recently learned that Python's multiprocessing Pool
requires you to call:
pool.close()
pool.join()
when you're finished, in order to free the memory used for state in those processes. Otherwise they persist and your machine fills up with Python processes; they won't use any CPU, but they hog memory.
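For reference, here's a minimal sketch of the cleanup pattern I mean (the worker function square is just an illustration, not part of my real code):

```python
from multiprocessing import Pool

def square(x):
    # trivial worker function, for illustration only
    return x * x

if __name__ == "__main__":
    pool = Pool(processes=4)
    results = pool.map(square, range(10))
    # no more tasks will be submitted; let workers exit...
    pool.close()
    # ...and wait for them, so their memory is actually released
    pool.join()
    print(results)
```

Without the close()/join() pair (or an equivalent terminate()), the worker processes stick around holding their state.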
My question is this:
I'm now using celery
for parallelization (instead of Pool
-- I'm operating within a Django WSGI app, and Pool
makes it difficult to prevent all users from forking jobs at once, which would crash the server).
I've noticed a very similar phenomenon with celery
: my 6 celery worker processes running in the background gradually gobble up memory. Is this normal, or is there an equivalent of calling close() and join() that I can use to tell a celery
task that I've retrieved the result and don't need it anymore?
Thanks a lot for the help.
Oliver
Upvotes: 3
Views: 283