Reputation: 2930
I'd like to have a process pool in a mostly async I/O application, because CPU-bound tasks sometimes need to be done that shouldn't stall the main application. Furthermore, I want to limit the number of processes.
According to the documentation, the right way is to use run_in_executor. The code below works, but it doesn't terminate the processes after the work is done.
import asyncio
from concurrent.futures.process import ProcessPoolExecutor

class App:
    def __init__(self):
        self.process_pool = ProcessPoolExecutor(4)
        self.loop = asyncio.get_event_loop()

    async def get_regular(self):
        return await regular()

    async def get_expensive(self):
        return await self.loop.run_in_executor(
            self.process_pool, expensive
        )
How do you reuse processes in the process pool or terminate them to obey the upper limit?
Upvotes: 0
Views: 462
Reputation: 6789
The process pool will have strange behavior if you reuse it. So I suggest creating a new pool every time and wrapping it in a with
statement, as demonstrated in the documentation's example.
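A minimal sketch of that pattern, applied to the question's setup (the `expensive` function here is a made-up stand-in for the asker's CPU-bound work): a fresh pool is created per call, and leaving the with-block shuts its workers down, so no processes linger.

```python
import asyncio
from concurrent.futures import ProcessPoolExecutor


def expensive(n):
    # Stand-in for a CPU-bound task.
    return sum(i * i for i in range(n))


async def get_expensive(n):
    loop = asyncio.get_running_loop()
    # New pool per call; __exit__ calls shutdown(wait=True),
    # which joins the worker processes once the task is done.
    with ProcessPoolExecutor(4) as pool:
        return await loop.run_in_executor(pool, expensive, n)


if __name__ == "__main__":
    print(asyncio.run(get_expensive(10)))  # → 285
```

Note the trade-off: spawning a pool per call adds process-startup overhead, and the implicit `shutdown(wait=True)` on exit briefly blocks the event loop, which is acceptable here because the awaited future is already finished.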
If you insist on reusing the pool, the responsibility for managing its lifetime falls on your shoulders. After use, you can terminate all subprocesses in the pool with
self.process_pool.shutdown()
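For example, one way to structure this (a sketch, with a hypothetical `close()` method and a stand-in `expensive` function) is to keep the long-lived pool from the question and shut it down explicitly when the application is done with it:

```python
import asyncio
from concurrent.futures.process import ProcessPoolExecutor


def expensive():
    # Stand-in for a CPU-bound task.
    return sum(i * i for i in range(1000))


class App:
    def __init__(self):
        # Long-lived pool, capped at 4 worker processes.
        self.process_pool = ProcessPoolExecutor(4)

    async def get_expensive(self):
        loop = asyncio.get_running_loop()
        return await loop.run_in_executor(self.process_pool, expensive)

    def close(self):
        # Joins and terminates the worker processes once
        # no more work will be submitted.
        self.process_pool.shutdown()


async def main():
    app = App()
    try:
        return await app.get_expensive()
    finally:
        app.close()


if __name__ == "__main__":
    print(asyncio.run(main()))
```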
Upvotes: 1