
How can I properly utilize the asynchronous functionality in a FastAPI route?

The following code snippet takes 10 seconds to complete a call to my /home route, while I expect it to only take 5 seconds.

from fastapi import FastAPI
import time

app = FastAPI()

async def my_func_1():
    """
    my func 1
    """
    print('Func1 started..!!')
    time.sleep(5)
    print('Func1 ended..!!')

    return 'a..!!'

async def my_func_2():
    """
    my func 2
    """
    print('Func2 started..!!')
    time.sleep(5)
    print('Func2 ended..!!')

    return 'b..!!'

@app.get("/home")
async def root():
    """
    my home route
    """
    start = time.time()
    a = await my_func_1()
    b = await my_func_2()
    end = time.time()
    print('It took {} seconds to finish execution.'.format(round(end-start)))

    return {
        'a': a,
        'b': b
    }

I am getting the following result, which looks non-asynchronous:

λ uvicorn fapi_test:app --reload
INFO:     Uvicorn running on http://127.0.0.1:8000 (Press CTRL+C to quit)
INFO:     Started reloader process [5116]
INFO:     Started server process [7780]
INFO:     Waiting for application startup.
INFO:     Application startup complete.
INFO:     127.0.0.1:51862 - "GET / HTTP/1.1" 404
Func1 started..!!
Func1 ended..!!
Func2 started..!!
Func2 ended..!!
It took 10 seconds to finish execution.
INFO:     127.0.0.1:51868 - "GET /home HTTP/1.1" 200

But, I am expecting FastAPI to print like below:

Func1 started..!!
Func2 started..!!
Func1 ended..!!
Func2 ended..!!
It took 5 seconds to finish execution.

Please correct me if I am doing anything wrong.

3 Answers


Perhaps a bit late, and elaborating on Hedde's answer above, here is how your app should look. You need both changes: await asyncio.sleep() instead of calling the blocking time.sleep(), and gather the awaitables instead of awaiting them one after the other. If you skip either step, the two tasks will not run concurrently.

from fastapi import FastAPI
import time
import asyncio

app = FastAPI()

async def my_func_1():
    """
    my func 1
    """
    print('Func1 started..!!')
    await asyncio.sleep(5)
    print('Func1 ended..!!')

    return 'a..!!'

async def my_func_2():
    """
    my func 2
    """
    print('Func2 started..!!')
    await asyncio.sleep(5)
    print('Func2 ended..!!')

    return 'b..!!'

@app.get("/home")
async def root():
    """
    my home route
    """
    start = time.time()
    # calling the coroutine functions returns coroutine objects;
    # asyncio.gather schedules them to run concurrently
    coros = [my_func_1(), my_func_2()]
    a, b = await asyncio.gather(*coros)
    end = time.time()
    print('It took {} seconds to finish execution.'.format(round(end-start)))

    return {
        'a': a,
        'b': b
    }
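To see the effect outside FastAPI, here is a minimal standalone sketch of the same pattern, with shorter delays substituted for the 5-second sleeps, showing that gathered coroutines overlap:

```python
import asyncio
import time

async def worker(name: str, delay: float) -> str:
    # each worker yields control while sleeping, so the other can run
    print(f'{name} started..!!')
    await asyncio.sleep(delay)
    print(f'{name} ended..!!')
    return name

async def main() -> None:
    start = time.perf_counter()
    # gather schedules both coroutines on the event loop at once
    a, b = await asyncio.gather(worker('Func1', 0.2), worker('Func2', 0.2))
    elapsed = time.perf_counter() - start
    # the two sleeps overlap, so the total is ~0.2s rather than ~0.4s
    print(a, b, round(elapsed, 1))

asyncio.run(main())
```

Replacing asyncio.gather with two sequential awaits makes the elapsed time the sum of the delays again, which is exactly the behaviour in the question.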


4 Comments

Is there any example we could use without asyncio.sleep(5)? Thanks.
You could try to have some remote calls to an external API instead. If the first call takes n time to receive a response, and the second call takes m time, then the total waiting time will be max(n, m).
Alternatively, you could try to change the input argument in sleep to see if the behaviour holds, e.g. whenever the longest concurrent task has been executed, /home will return.
@y-a-prasad if the answer is useful, please consider accepting it.
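The max(n, m) behaviour described in the comments can be sketched with stand-in delays; fake_api_call here is a hypothetical placeholder for a real remote call:

```python
import asyncio
import time

async def fake_api_call(name: str, latency: float) -> str:
    # stands in for an external API call with the given response time
    await asyncio.sleep(latency)
    return name

async def main() -> None:
    start = time.perf_counter()
    results = await asyncio.gather(
        fake_api_call('first', 0.1),
        fake_api_call('second', 0.3),
    )
    elapsed = time.perf_counter() - start
    # total wait is roughly max(0.1, 0.3) seconds, not the sum
    print(results, round(elapsed, 1))

asyncio.run(main())
```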

time.sleep is blocking; you should use asyncio.sleep. There are also asyncio.gather and asyncio.wait to aggregate jobs. This is well documented within Python and FastAPI.
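When the blocking call cannot be replaced (a synchronous library, for instance), one option is to offload it to a worker thread so gathered calls still overlap. A minimal sketch using asyncio.to_thread (available from Python 3.9):

```python
import asyncio
import time

def blocking_task(name: str) -> str:
    # a blocking call that would otherwise freeze the event loop
    time.sleep(0.2)
    return name

async def main() -> None:
    start = time.perf_counter()
    # to_thread runs each blocking call in the default thread pool
    a, b = await asyncio.gather(
        asyncio.to_thread(blocking_task, 'a..!!'),
        asyncio.to_thread(blocking_task, 'b..!!'),
    )
    # the two 0.2s sleeps overlap in separate threads
    print(a, b, round(time.perf_counter() - start, 1))

asyncio.run(main())
```

FastAPI itself does something similar when a route is declared with plain def: it runs the handler in a threadpool so it does not block the event loop.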

Comments


Chrome, at least, blocks concurrent GET requests to the same URL (probably to get a chance to reuse the cached version for the next one).

Testing with one of the requests in an Incognito window should work, with def as well as with async def.

Comments
