Reputation: 2812
Pretty new to async, so sorry for not being able to phrase the question very accurately.
I tried to use asyncio following this example (https://pythonprogramming.net/asyncio-basics-intermediate-python-tutorial/), hoping to get a speed benefit from running multiple requests.get() calls concurrently rather than sequentially.
import asyncio
import time

import requests


async def get_text(url):
    print(f"Load {url}")
    data = requests.get(url).text
    await asyncio.sleep(0.0001)
    print(f"Finished loading {url}")
    return data


async def main(name):
    tasks = [loop.create_task(get_text(n)) for n in name]
    await asyncio.wait(tasks)
    return tasks


if __name__ == '__main__':
    url = ['https://www.nytimes.com', 'https://news.yahoo.com']
    s = time.perf_counter()
    loop = asyncio.get_event_loop()
    result = loop.run_until_complete(main(url))
    loop.close()
    e = time.perf_counter() - s
    print(f"Time takes: {e:0.2f}s")
But this apparently takes just as long as running the two requests in sequence. Can you point out what I did wrong and how to do it properly in order to save time? I saw another example using ThreadPoolExecutor and loop.run_in_executor(), but I am not sure how to incorporate it.
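For reference, the run_in_executor example I saw was roughly along these lines (reconstructed from memory, so details may be off):

import asyncio
import time
from concurrent.futures import ThreadPoolExecutor

import requests


def get_text(url):
    # ordinary blocking call; the executor runs it on a worker thread
    print(f"Load {url}")
    data = requests.get(url).text
    print(f"Finished loading {url}")
    return data


async def main(urls):
    loop = asyncio.get_running_loop()
    with ThreadPoolExecutor() as pool:
        # each blocking get_text() call is handed off to the thread pool
        futures = [loop.run_in_executor(pool, get_text, u) for u in urls]
        return await asyncio.gather(*futures)


if __name__ == '__main__':
    urls = ['https://www.nytimes.com', 'https://news.yahoo.com']
    s = time.perf_counter()
    result = asyncio.run(main(urls))
    e = time.perf_counter() - s
    print(f"Time takes: {e:0.2f}s")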
Best regards,
J
Upvotes: 1
Views: 378
Reputation: 5387
requests does synchronous IO, so each call to requests.get is going to block the entire event loop. If you want async IO, use another library, such as aiohttp.
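For example, a rough port of your code to aiohttp could look like this (an untested sketch, not your exact program; the key point is that the HTTP reads are awaited, so the event loop can interleave the two downloads):

import asyncio
import time

import aiohttp  # pip install aiohttp


async def get_text(session, url):
    print(f"Load {url}")
    # awaiting here yields control back to the event loop,
    # so the other request can make progress in the meantime
    async with session.get(url) as resp:
        data = await resp.text()
    print(f"Finished loading {url}")
    return data


async def main(urls):
    async with aiohttp.ClientSession() as session:
        # gather schedules all coroutines concurrently and
        # returns their results in order
        return await asyncio.gather(*(get_text(session, u) for u in urls))


if __name__ == '__main__':
    urls = ['https://www.nytimes.com', 'https://news.yahoo.com']
    s = time.perf_counter()
    results = asyncio.run(main(urls))
    e = time.perf_counter() - s
    print(f"Time takes: {e:0.2f}s")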
Upvotes: 3
Reputation: 1979
You can achieve parallelism here by using Python multiprocessing: https://docs.python.org/3.7/library/multiprocessing.html It takes a bit of reading, but essentially it lets you start new non-blocking processes that run in parallel, each executing your code independently. So yes, you can make parallel request calls. Make an attempt to convert your code to use multiprocessing, and if you need help along the way, I'd be happy to help!
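As a starting point, a rough sketch using a multiprocessing.Pool (just one of several possible shapes, not the only way to do it):

import time
from multiprocessing import Pool

import requests


def get_text(url):
    # plain blocking call; each worker process runs one independently
    print(f"Load {url}")
    data = requests.get(url).text
    print(f"Finished loading {url}")
    return data


if __name__ == '__main__':
    urls = ['https://www.nytimes.com', 'https://news.yahoo.com']
    s = time.perf_counter()
    # one worker process per URL; map blocks until all fetches finish
    with Pool(processes=len(urls)) as pool:
        results = pool.map(get_text, urls)
    e = time.perf_counter() - s
    print(f"Time takes: {e:0.2f}s")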
Upvotes: 0