vladimir.gorea

Reputation: 661

asyncio tasks using aiohttp.ClientSession

I'm using Python 3.7 and trying to make a crawler that can crawl multiple domains asynchronously. For this I'm using asyncio and aiohttp, but I'm running into problems with aiohttp.ClientSession. This is my reduced code:

import aiohttp
import asyncio

async def fetch(session, url):
    async with session.get(url) as response:
        print(await response.text())

async def main():
    loop = asyncio.get_event_loop()
    async with aiohttp.ClientSession(loop=loop) as session:
        cwlist = [loop.create_task(fetch(session, url)) for url in ['http://python.org', 'http://google.com']]
        asyncio.gather(*cwlist)

if __name__ == "__main__":
    asyncio.run(main())

The thrown exception is this:

_GatheringFuture exception was never retrieved future: <_GatheringFuture finished exception=RuntimeError('Session is closed')>

What am I doing wrong here?

Upvotes: 0

Views: 2895

Answers (1)

Jean-Paul Calderone

Reputation: 48335

You forgot to await the asyncio.gather result:

    async with aiohttp.ClientSession(loop=loop) as session:
        cwlist = [loop.create_task(fetch(session, url)) for url in ['http://python.org', 'http://google.com']]
        await asyncio.gather(*cwlist)

If you ever have an async with block containing no await expressions, you should be fairly suspicious.
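To see why the missing await produces "Session is closed", here is a minimal sketch with no network access. FakeSession is a hypothetical stand-in for aiohttp.ClientSession, written here only to illustrate the scheduling: without the await, control falls out of the async with immediately, the session closes, and the tasks only run afterwards.

```python
import asyncio

class FakeSession:
    """Hypothetical stand-in for aiohttp.ClientSession (illustration only)."""
    def __init__(self):
        self.closed = False

    async def __aenter__(self):
        return self

    async def __aexit__(self, exc_type, exc, tb):
        # Leaving the async with block closes the session.
        self.closed = True

    async def get(self, url):
        await asyncio.sleep(0)  # yield to the event loop, like a real request
        if self.closed:
            raise RuntimeError("Session is closed")
        return f"response for {url}"

async def fetch(session, url):
    return await session.get(url)

async def main(await_gather):
    async with FakeSession() as session:
        tasks = [asyncio.create_task(fetch(session, url))
                 for url in ["http://python.org", "http://google.com"]]
        if await_gather:
            # Correct: wait for the tasks while the session is still open.
            return await asyncio.gather(*tasks)
    # Without the await we fall straight out of the async with; the session
    # is closed before the tasks ever run, so each one fails.
    return await asyncio.gather(*tasks, return_exceptions=True)

good = asyncio.run(main(True))   # both fetches succeed
bad = asyncio.run(main(False))   # both raise RuntimeError('Session is closed')
```

The same ordering applies to the real code in the question: asyncio.gather without await only schedules the wait, so main returns, the ClientSession's __aexit__ closes the session, and the fetch tasks then fail when they finally get to run.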

Upvotes: 3
