Reputation: 3637
I have created an async Python coroutine with async def, and I would like to run it over every element of a list.
However, the coroutine launches a separate process, and my computer has limited resources, so I would like to run only n of these coroutines at a time. When one finishes, I would like another one to be started.
I'm still learning asyncio and I'm lost as to how to do this within that framework.
I know I can run n jobs concurrently by using something like this:
from itertools import islice
import asyncio

commands = asyncio.gather(*[run_command(f) for f in islice(my_large_list, n)])
# Run the commands
results = loop.run_until_complete(commands)
However, I don't know how to replace each job as it is completed.
Upvotes: 1
Views: 1297
Reputation: 30512
One option is to use asyncio.Semaphore:
import asyncio
import random

s = asyncio.Semaphore(5)

async def my_coroutine(i):
    async with s:
        print("start", i)
        await asyncio.sleep(random.uniform(1, 3))
        print("end", i)

loop = asyncio.get_event_loop()
tasks = [my_coroutine(i + 1) for i in range(50)]
loop.run_until_complete(asyncio.gather(*tasks))
loop.close()
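On Python 3.7+ the same semaphore pattern can be written with asyncio.run, and the coroutine can return a value so that gather collects the results in order. A minimal sketch, with asyncio.sleep standing in for the real subprocess work and the job count and limit chosen arbitrarily:

```python
import asyncio
import random

async def my_coroutine(sem, i):
    # The semaphore caps how many coroutine bodies run at once;
    # the rest wait here until a slot frees up.
    async with sem:
        await asyncio.sleep(random.uniform(0.01, 0.05))
        return i * i

async def main(n_jobs=50, limit=5):
    # Create the semaphore inside the running loop (required on 3.10+).
    sem = asyncio.Semaphore(limit)
    return await asyncio.gather(*(my_coroutine(sem, i) for i in range(n_jobs)))

results = asyncio.run(main())
print(results[:5])  # results come back in submission order
```

Because gather preserves submission order, results[i] always corresponds to job i even though the jobs finish in an arbitrary order.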
Update: concurrent.futures would probably solve your problem in a much simpler way than asyncio, since executors have a straightforward max_workers argument:
import concurrent.futures
import time
import random

def my_routine(i):
    print("start", i)
    # Here you could call subprocess.* to do anything; we just sleep:
    time.sleep(random.uniform(1, 3))
    print("end", i)
    return "i={}".format(i)

with concurrent.futures.ProcessPoolExecutor(max_workers=5) as executor:
    jobs = {executor.submit(my_routine, i + 1) for i in range(50)}
    for fut in concurrent.futures.as_completed(jobs):
        print(fut.result())

print('done')
Upvotes: 3