dmzkrsk

Reputation: 2115

Iterating over asyncio coroutines/Tasks as soon as they are ready

I launch a bunch of requests using aiohttp. Is there a way to get the results one by one, as soon as each request completes?

Perhaps using something like async for? Or Python 3.6 async generators?

Currently I await asyncio.gather(*requests) and only process the results after all of them have completed.
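
For context, roughly what I do now (fetch stands in for my real request coroutine):

import asyncio

import aiohttp


async def fetch(session, url):
    # Placeholder request coroutine: GET a URL and return the body.
    async with session.get(url) as resp:
        return await resp.text()


async def main(urls):
    async with aiohttp.ClientSession() as session:
        requests = [fetch(session, url) for url in urls]
        # gather preserves input order, but yields nothing until
        # the slowest request has finished.
        results = await asyncio.gather(*requests)
        for url, body in zip(urls, results):
            print(url, len(body))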

Upvotes: 6

Views: 2894

Answers (3)

Andrew Svetlov

Reputation: 17366

The canonical way is to push results into an asyncio.Queue, as in the crawler example. It's also wise to run a limited number of download tasks that take new jobs from an input queue, instead of spawning a million new tasks.
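
A minimal sketch of that pattern, assuming a fixed pool of workers and made-up names (worker, in_queue, out_queue); results come off out_queue in completion order:

import asyncio

import aiohttp


async def worker(session, in_queue, out_queue):
    # Pull URLs from the input queue until a None sentinel arrives.
    while True:
        url = await in_queue.get()
        if url is None:
            break
        async with session.get(url) as resp:
            body = await resp.text()
        # Push each result as soon as it is ready.
        await out_queue.put((url, body))


async def main(urls, concurrency=10):
    in_queue = asyncio.Queue()
    out_queue = asyncio.Queue()
    for url in urls:
        in_queue.put_nowait(url)
    for _ in range(concurrency):
        in_queue.put_nowait(None)  # one stop sentinel per worker
    async with aiohttp.ClientSession() as session:
        workers = [asyncio.ensure_future(worker(session, in_queue, out_queue))
                   for _ in range(concurrency)]
        for _ in urls:
            url, body = await out_queue.get()  # arrives in completion order
            print(url, len(body))
        await asyncio.gather(*workers)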

Upvotes: 1

Mikhail Gerasimov

Reputation: 39546

asyncio has an as_completed function that probably does what you need. Note that it returns a regular iterator, not an async one.

Here's an example of usage:

import asyncio


async def test(i):
    await asyncio.sleep(i)
    return i


async def main():
    fs = [
        test(1),
        test(2),
        test(3),
    ]

    for f in asyncio.as_completed(fs):
        i = await f  # Await the next result, in completion order.
        print(i, 'done')


# Pre-3.7 boilerplate; on Python 3.7+ asyncio.run(main()) does the same.
loop = asyncio.new_event_loop()
asyncio.set_event_loop(loop)
try:
    loop.run_until_complete(main())
finally:
    loop.run_until_complete(loop.shutdown_asyncgens())
    loop.close()

Output:

1 done
2 done
3 done

Upvotes: 5

Ivan Klass

Reputation: 6627

As I understand from the docs, requests are Futures (or can easily be converted to a Future using asyncio.ensure_future).

A Future object has an .add_done_callback method, so you can add a callback to every request and then run gather.

Docs for Future.add_done_callback
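
A minimal sketch of that approach, reusing the test coroutine from the answer above (on_done is a hypothetical callback name); the callback fires per task, in completion order:

import asyncio


async def test(i):
    await asyncio.sleep(i)
    return i


def on_done(future):
    # Runs as soon as this particular future completes.
    print(future.result(), 'done')


async def main():
    tasks = [asyncio.ensure_future(test(i)) for i in (1, 2, 3)]
    for task in tasks:
        task.add_done_callback(on_done)
    await asyncio.gather(*tasks)


loop = asyncio.get_event_loop()
loop.run_until_complete(main())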

Upvotes: 0
