Reputation: 2891
I have a simple service that receives a JSON payload and returns a JSON payload; the payloads are relatively large, ~200 KB to ~300 KB. I noticed that even if I don't do any processing on the JSON, the full round trip still takes about 30-40 milliseconds (even from my own machine).
Here is my code:
from typing import Dict

from fastapi import FastAPI

app = FastAPI(title="My test")


@app.post("/large_json_test")
async def foo(request: dict) -> dict:
    something = request.get("something") or "NA"
    x = "whatever" + something
    return {"request": request, "x": x}
When running the following bash command:
time curl -XPOST "127.0.0.1:8080/large_json_test" -d @large.json -H "Content-Type: application/json" 1> /dev/null
it returns the following results:
0.00s user 0.00s system 15% cpu 0.043 total
The total time can also be around 0.050s.
Any suggestions on how I can make this faster?
Worth mentioning: with a small JSON request the round trip takes about 10 ms, which suggests that reading, serializing, and writing back the JSON accounts for about 30-40 milliseconds.
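To sanity-check that the (de)serialization itself is the dominant cost, a rough micro-benchmark of a plain json round trip on the same payload can help. A minimal sketch, assuming large.json is the same file used in the curl command above:

import json
import timeit

# Load the same payload that the curl command posts to the service.
with open("large.json") as f:
    payload = json.load(f)

# One dumps + loads per iteration, roughly the serialization work the
# endpoint does on top of network and framework overhead.
n = 100
seconds = timeit.timeit(lambda: json.loads(json.dumps(payload)), number=n)
print(f"json round trip: {seconds / n * 1000:.2f} ms per call")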
Upvotes: 1
Views: 3896
Reputation: 2891
As luk2302 suggested, building the response with orjson or ujson makes it significantly faster.
code:
@app.post("/large_json_test1")
async def foo1(request: dict) -> Response:
something = request.get("something") or "NA"
x = "whatever" + something
res = {"request": request, "x": x}
return Response(
content=ujson.dumps(res)
)
@app.post("/large_json_test2")
async def foo2(request: dict) -> Response:
something = request.get("something") or "NA"
x = "whatever" + something
res = {"request": request, "x": x}
return Response(
content=orjson.dumps(res)
)
results:
ujson:
time curl -XPOST "127.0.0.1:8080/large_json_test1" -d @large.json -H "Content-Type: application/json" 1> /dev/null
0.00s user 0.00s system 43% cpu 0.014 total

orjson:
time curl -XPOST "127.0.0.1:8080/large_json_test2" -d @example_session.json -H "Content-Type: application/json" 1> /dev/null
0.00s user 0.00s system 41% cpu 0.016 total
Upvotes: 1