Andrea

Reputation: 39

How can I send 5000 JSON requests as fast as possible in Python?

Let me first explain what I am doing: I am testing my web server's API. I am a beginner in Python and would like to send the following request 5000 times as fast as possible (one second or less would be perfect). The only thing that matters to me is that these 5000 requests arrive at my server at roughly the same time, so that I can measure the server's capacity. My request in bash is

  curl 'https://myserver.com/api/order' \
  -H 'Accept: application/json, text/plain, */*' \
  -H 'Content-Type: application/json' \
  --data-binary '{"id":"ID201","financeId":1,"name":"name","family":"family","side":0,"validityType":99}'
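For reference, a single equivalent request can be built with the standard library's urllib. The URL and the literal "name"/"family" values are taken from the curl command above; a real client would substitute actual values:

```python
import json
import urllib.request

# Payload from the curl command; "name" and "family" stand in for
# whatever real values the API expects.
payload = {"id": "ID201", "financeId": 1, "name": "name",
           "family": "family", "side": 0, "validityType": 99}

req = urllib.request.Request(
    "https://myserver.com/api/order",
    data=json.dumps(payload).encode(),
    headers={"Accept": "application/json, text/plain, */*",
             "Content-Type": "application/json"},
    method="POST",
)
# urllib.request.urlopen(req) would actually send it; omitted here
# so the sketch does not require a live server.
```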

Upvotes: 0

Views: 1825

Answers (3)

Booboo

Reputation: 44003

This is similar to Maxim Dunavicher's answer in that it uses aiohttp to make asynchronous requests so that multiple requests can run concurrently. Unlike his approach, which keeps a single connection open for reuse across requests, this one opens a new session per request. Yet when I benchmarked it against my local Apache server with N = 100, it completed in approximately a third of the time, for which I do not have a good explanation.

import asyncio
from aiohttp import ClientSession

N = 5000

async def get(url):
    # Each coroutine opens its own session, so connections are not
    # reused between requests.
    async with ClientSession() as session:
        async with session.get(url) as response:
            return await response.read()

async def main():
    coroutines = [get(f"http://localhost?x={i}") for i in range(N)]
    return await asyncio.gather(*coroutines)

# asyncio.run replaces the now-deprecated get_event_loop pattern
results = asyncio.run(main())
#print(results)
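If 5000 simultaneous connections overwhelm the client or the server, the same pattern can be throttled with a semaphore. The sketch below is self-contained: it uses a stand-in coroutine instead of a real aiohttp call, so swap fake_request for the get coroutine above in practice.

```python
import asyncio

async def bounded(sem, coro_fn, arg):
    # The semaphore caps how many requests are in flight at once.
    async with sem:
        return await coro_fn(arg)

async def fake_request(i):
    # Stand-in for a real aiohttp call.
    await asyncio.sleep(0)
    return i * 2

async def main(n=10, limit=3):
    sem = asyncio.Semaphore(limit)
    tasks = [bounded(sem, fake_request, i) for i in range(n)]
    return await asyncio.gather(*tasks)

results = asyncio.run(main())
print(results)  # [0, 2, 4, 6, 8, 10, 12, 14, 16, 18]
```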

Upvotes: 1

Maxim Dunavicher

Reputation: 635

You can use GRequests, which uses gevent under the hood to make your requests concurrently:

import grequests

urls = [
    'http://www.heroku.com',
    'http://tablib.org',
    'http://httpbin.org',
    'http://python-requests.org',
    'http://kennethreitz.com'
]
rs = (grequests.get(u) for u in urls)
responses = grequests.map(rs)
print(responses)
# [<Response [200]>, <Response [200]>, <Response [200]>, <Response [200]>, <Response [200]>]

Another way is to use asyncio's event loop (similar to JavaScript). This approach is more modern, and it does not rely on gevent, which is incompatible with some other third-party libraries:

import asyncio
from aiohttp import ClientSession

async def fetch(url, session):
    async with session.get(url) as response:
        return await response.read()

async def run(r):
    url = "http://localhost:8080/{}"

    # Fetch all responses within one ClientSession,
    # keeping the connection alive for all requests.
    async with ClientSession() as session:
        tasks = [asyncio.ensure_future(fetch(url.format(i), session))
                 for i in range(r)]
        responses = await asyncio.gather(*tasks)
        # you now have all response bodies in this variable
        print(responses)

asyncio.run(run(4))

Upvotes: 1

Anton Pomieshchenko

Reputation: 2167

For load-testing JSON requests it is better to use ab (ApacheBench, the Apache HTTP server benchmarking tool).

http://httpd.apache.org/docs/2.4/programs/ab.html
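For the question's POST payload, an invocation along these lines should work (the file name and concurrency level are illustrative: -n is the total number of requests, -c how many run concurrently, -p the file holding the POST body, -T its content type):

```shell
# payload.json contains the JSON body from the question
ab -n 5000 -c 100 -p payload.json -T 'application/json' \
   -H 'Accept: application/json, text/plain, */*' \
   https://myserver.com/api/order
```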

Upvotes: 0
