Y A Prasad

Reputation: 391

How to get Python FastAPI async/await functionality to work properly?

How can I properly utilize the asynchronous functionality in a FastAPI route?

The following code snippet takes 10 seconds to complete a call to my /home route, while I expect it to only take 5 seconds.

from fastapi import FastAPI
import time

app = FastAPI()

async def my_func_1():
    """
    my func 1
    """
    print('Func1 started..!!')
    time.sleep(5)
    print('Func1 ended..!!')

    return 'a..!!'

async def my_func_2():
    """
    my func 2
    """
    print('Func2 started..!!')
    time.sleep(5)
    print('Func2 ended..!!')

    return 'b..!!'

@app.get("/home")
async def root():
    """
    my home route
    """
    start = time.time()
    a = await my_func_1()
    b = await my_func_2()
    end = time.time()
    print('It took {} seconds to finish execution.'.format(round(end-start)))

    return {
        'a': a,
        'b': b
    }

I am getting the following result, which looks non-asynchronous (the two functions run one after the other):

λ uvicorn fapi_test:app --reload
INFO:     Uvicorn running on http://127.0.0.1:8000 (Press CTRL+C to quit)
INFO:     Started reloader process [5116]
INFO:     Started server process [7780]
INFO:     Waiting for application startup.
INFO:     Application startup complete.
INFO:     127.0.0.1:51862 - "GET / HTTP/1.1" 404
Func1 started..!!
Func1 ended..!!
Func2 started..!!
Func2 ended..!!
It took 10 seconds to finish execution.
INFO:     127.0.0.1:51868 - "GET /home HTTP/1.1" 200

But, I am expecting FastAPI to print like below:

Func1 started..!!
Func2 started..!!
Func1 ended..!!
Func2 ended..!!
It took 5 seconds to finish execution.

Please correct me if I am doing anything wrong.

Upvotes: 27

Views: 48166

Answers (3)

Robert Verdes

Reputation: 400

Chrome, at least, blocks concurrent GET requests to the same URL (probably so it can reuse the cached response for the second one).

Testing with one regular Chrome window and one Incognito window should show the requests running concurrently, with "def" as well as with "async def".

Upvotes: 1

Mattia Paterna

Reputation: 1366

Perhaps a bit late, and elaborating on Hedde's answer above, here is how your app code should look. Remember to await the sleep and to gather the awaitables; if you don't, then no matter whether you use time.sleep() or asyncio.sleep(), the two tasks will not run concurrently.

from fastapi import FastAPI
import time
import asyncio

app = FastAPI()

async def my_func_1():
    """
    my func 1
    """
    print('Func1 started..!!')
    await asyncio.sleep(5)
    print('Func1 ended..!!')

    return 'a..!!'

async def my_func_2():
    """
    my func 2
    """
    print('Func2 started..!!')
    await asyncio.sleep(5)
    print('Func2 ended..!!')

    return 'b..!!'

@app.get("/home")
async def root():
    """
    my home route
    """
    start = time.time()
    coros = [my_func_1(), my_func_2()]
    a, b = await asyncio.gather(*coros)
    end = time.time()
    print('It took {} seconds to finish execution.'.format(round(end-start)))

    return {
        'a': a,
        'b': b
    }
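A side note on the `a, b = ...` unpacking above: asyncio.gather returns the results in the order the awaitables were passed in, regardless of which coroutine finishes first, which is what makes the unpacking safe. A minimal standalone check (the `tagged` helper is made up for illustration, not part of the question's code):

```python
import asyncio

async def tagged(tag, delay):
    # Simulate I/O of varying duration, then return the tag.
    await asyncio.sleep(delay)
    return tag

async def main():
    # The second coroutine finishes first, but gather keeps argument order.
    return await asyncio.gather(tagged('a..!!', 0.2), tagged('b..!!', 0.1))

a, b = asyncio.run(main())
print(a, b)  # a..!! b..!!
```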

Upvotes: 26

Hedde van der Heide
Hedde van der Heide

Reputation: 22449

time.sleep is blocking; you should use asyncio.sleep instead. There are also asyncio.gather and asyncio.wait to aggregate jobs. This is well documented in both the Python and FastAPI docs.
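Since the other answer here already shows asyncio.gather, a minimal sketch of the asyncio.wait variant, runnable outside FastAPI (the `job` and `main` names are made up for illustration; delays are shortened to 1 second):

```python
import asyncio
import time

async def job(name, delay):
    # A stand-in for real non-blocking I/O; time.sleep here would
    # serialize the tasks instead of letting them overlap.
    await asyncio.sleep(delay)
    return name

async def main():
    start = time.time()
    # asyncio.wait needs Tasks (passing bare coroutines was removed in 3.11)
    tasks = [asyncio.create_task(job('a..!!', 1)),
             asyncio.create_task(job('b..!!', 1))]
    done, pending = await asyncio.wait(tasks)
    elapsed = round(time.time() - start)
    # done is an unordered set, so pull the results out explicitly
    return sorted(t.result() for t in done), elapsed

results, elapsed = asyncio.run(main())
print(results, elapsed)
```

Unlike gather, asyncio.wait gives you the done/pending sets rather than ordered results, which is handy with `return_when=asyncio.FIRST_COMPLETED` but means you must extract results yourself.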

Upvotes: 20
