Reputation:
Currently using threads to make multiple "asynchronous" requests to download files. It has been suggested to me to look into using asyncio now that we have upgraded to Python 3+.
We have to use ssl.SSLContext(protocol=ssl.PROTOCOL_TLS) and pass PEM and KEY files.
Can I simply convert my http.client script to work with asyncio, or do I need to convert the http.client functions to aiohttp functions?
Bonus:
Either way, could somebody outline how to choose between async def and @asyncio.coroutine?
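For concreteness, here is a sketch of the two styles I mean, using asyncio.sleep as a stand-in for real work (the decorator form is the pre-3.5, generator-based style, shown commented out since it was deprecated in 3.8 and removed in 3.11):

```python
import asyncio

# Python 3.5+ native coroutine syntax
async def work_native():
    await asyncio.sleep(0)
    return 'done'

# Pre-3.5 generator-based equivalent (removed in Python 3.11):
#
# @asyncio.coroutine
# def work_generator():
#     yield from asyncio.sleep(0)
#     return 'done'

print(asyncio.run(work_native()))  # done
```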
Supporting Info:
My program performs fairly well: our last version ran its queries sequentially, and this one runs around 15 concurrent requests in parallel. However, another program I am working on sends upwards of 500 requests in parallel; I could use a threading.Semaphore(), but I have read that asyncio is great for large volumes of requests.
First time messing around with it, so...
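For reference, the pattern I have in mind is something like the following sketch, where an asyncio.Semaphore caps concurrency the way a threading.Semaphore would, and asyncio.sleep stands in for the actual download:

```python
import asyncio

async def download(sem, url):
    async with sem:  # at most `limit` downloads in flight at once
        await asyncio.sleep(0.01)  # stand-in for the real request
        return url

async def main(urls, limit=15):
    sem = asyncio.Semaphore(limit)  # create inside the running loop
    # schedule all requests at once; the semaphore throttles them
    return await asyncio.gather(*(download(sem, u) for u in urls))

urls = ['https://example.com/{}'.format(i) for i in range(500)]
results = asyncio.run(main(urls))
print(len(results))  # 500
```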
I'm reading the following (and like 80% of it seems to be about non-blocking socket connections; the only applicable section is "Coordinating Coroutines").
Upvotes: 2
Views: 1437
Reputation: 1124708
You can use aiohttp; you'll need to translate your http.client code to use the aiohttp client API. You can re-use your ssl.SSLContext() object; pass it to a TCPConnector() instance, then create a client session from that:
import aiohttp
import ssl

SSL_CONTEXT = ssl.SSLContext(protocol=ssl.PROTOCOL_TLS)
SSL_CONTEXT.load_cert_chain(certfile='foo', keyfile='bar')

async def fetch_url(url):
    conn = aiohttp.TCPConnector(ssl_context=SSL_CONTEXT)
    async with aiohttp.ClientSession(connector=conn) as client:
        async with client.get(url) as resp:
            print(resp.status)
            print(await resp.text())
Note that you can't share the connector between client sessions; if you want to re-use connections, reuse the client session itself.
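A sketch of that reuse pattern, assuming the same SSL_CONTEXT setup as above: create one ClientSession and fan every request out over it (recent aiohttp versions spell the connector argument ssl= rather than the older ssl_context=):

```python
import asyncio
import ssl
import aiohttp

SSL_CONTEXT = ssl.SSLContext(protocol=ssl.PROTOCOL_TLS)

async def fetch(client, url):
    async with client.get(url) as resp:
        return await resp.text()

async def fetch_all(urls):
    # `ssl=` on aiohttp 3.0+; older versions used `ssl_context=`
    conn = aiohttp.TCPConnector(ssl=SSL_CONTEXT)
    # one session (and therefore one connection pool) shared by
    # every request, instead of a session per URL
    async with aiohttp.ClientSession(connector=conn) as client:
        return await asyncio.gather(*(fetch(client, u) for u in urls))
```

You would then run it with, e.g., asyncio.run(fetch_all(urls)) for your list of URLs.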
Upvotes: 0