Reputation: 3490
Consider the following case: there is a slow server that takes about 200 ms to handle a request (not including the network transfer time), and we need to send a large batch of requests to it every second.
After reading this post, I have tried multi-threading, multi-processing, Twisted (agent.request) and eventlet, but the biggest speedup is only about 6x, achieved with Twisted and eventlet, both of which use epoll.
The following code shows the test version with eventlet:
import eventlet
eventlet.monkey_patch(all=False, socket=True)  # only the socket module needs to be green
import requests

def send():
    # run at most 30 requests concurrently
    pile = eventlet.GreenPile(30)
    for i in range(1000):
        pile.spawn(requests.get, 'https://api.???.com/', timeout=1)
    # iterate over the results (blocks until each request completes)
    for response in pile:
        if response:
            print response.elapsed, response.text
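For comparison, the Twisted attempt used agent.request roughly like this (a sketch rather than the exact code; it just fires the requests without reading the response bodies):

from twisted.internet import reactor
from twisted.internet.defer import DeferredList
from twisted.web.client import Agent

def fetch_all():
    agent = Agent(reactor)
    # fire 1000 GETs; each agent.request() returns a Deferred
    ds = [agent.request('GET', 'https://api.???.com/') for _ in range(1000)]
    # stop the reactor once every request has succeeded or failed
    DeferredList(ds, consumeErrors=True).addCallback(lambda _: reactor.stop())

reactor.callWhenRunning(fetch_all)
reactor.run()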
Could anyone help me understand why the speedup is so low? And is there any other mechanism that could make it much faster?
Upvotes: 3
Views: 14968
Reputation: 931
I know this is an old post, but someone might still need this.
If you want to do load testing with Python, you should use a tool like Locust: http://locust.io/
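For example, a minimal Locust file might look something like this (a sketch assuming Locust 1.0+; older releases used HttpLocust/TaskSet instead, and the task name here is just illustrative):

from locust import HttpUser, task, between

class ApiUser(HttpUser):
    # each simulated user waits 1-2 seconds between tasks
    wait_time = between(1, 2)

    @task
    def index(self):
        self.client.get("/")

Run it with something like locust -f locustfile.py --host http://www.google.co.il and set the number of simulated users from the web UI.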
Here is my solution, which completed 10,000 requests in about 10 seconds:
Required package: sudo pip install grequests
Code:
import grequests
import time
start_time = time.time()
# Create 10,000 requests
urls = ['http://www.google.co.il']*10000
rs = (grequests.head(u) for u in urls)
# Send them.
grequests.map(rs)
print time.time() - start_time # Result was: 9.66666889191
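If spawning all 10,000 requests at once is too aggressive for the target, grequests.map also takes a size argument to cap the concurrency (the value 100 below is just an example):

import grequests

urls = ['http://www.google.co.il'] * 10000
rs = (grequests.head(u) for u in urls)

# limit the gevent pool to 100 concurrent requests
responses = grequests.map(rs, size=100)

# failed requests come back as None
print len([r for r in responses if r is not None])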
Upvotes: 8