
FastAPI Best Practices

Limit Only One Access to an Endpoint at a Time

Limit access to an endpoint to one request at a time with asyncio.Lock in FastAPI.

../code-snippets/python/app_request_lock.py
# main.py
from fastapi import FastAPI
import asyncio

app = FastAPI()
lock = asyncio.Lock()

counter = 0


@app.post("/limit")
async def func():
    global counter
    async with lock:
        print("Hello")
        counter = counter + 1
        await asyncio.sleep(2)
        print("bye")
        await asyncio.sleep(2)
        return {"counter": counter}


"""
Make 2 requests at a time, output from server:

INFO: 127.0.0.1:60228 - "POST /limit HTTP/1.1" 200 OK
Hello
bye
INFO: 127.0.0.1:51010 - "POST /limit HTTP/1.1" 200 OK
Hello
bye
INFO: 127.0.0.1:51022 - "POST /limit HTTP/1.1" 200 OK

Request 1:

❯ curl -X 'POST' \
'http://127.0.0.1:8000/limit' \
-H 'accept: application/json' \
-d ''
{"counter":1}%

Request 2:

❯ curl -X 'POST' \
'http://127.0.0.1:8000/limit' \
-H 'accept: application/json' \
-d ''
{"counter":2}%
"""

NOTE: asyncio.Lock only works within a single event loop. If you run the server with uvicorn using multiple worker processes, each worker has its own loop and its own lock instance, so requests handled by different workers are not serialized.
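For example, launching the app with multiple workers (command shown as a sketch; the module name `main` matches the snippet above):

```shell
# Each of the 4 worker processes creates its own event loop and its
# own asyncio.Lock, so two requests routed to different workers can
# run the /limit handler concurrently despite the lock.
uvicorn main:app --workers 4
```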

For comparison, the same endpoint without the lock:

../code-snippets/python/app_request_nolock.py
# main.py
from fastapi import FastAPI
import asyncio

app = FastAPI()

counter = 0


@app.post("/limit")
async def func():
    global counter
    print("Hello")
    counter = counter + 1
    await asyncio.sleep(2)
    print("bye")
    await asyncio.sleep(2)
    return {"counter": counter}


"""
Make 2 requests at a time, output from server:

Hello
Hello
bye
bye
INFO: 127.0.0.1:45160 - "POST /limit HTTP/1.1" 200 OK
INFO: 127.0.0.1:45172 - "POST /limit HTTP/1.1" 200 OK

Request 1:

❯ curl -X 'POST' \
'http://127.0.0.1:8000/limit' \
-H 'accept: application/json' \
-d ''
{"counter":2}%

Request 2:

❯ curl -X 'POST' \
'http://127.0.0.1:8000/limit' \
-H 'accept: application/json' \
-d ''
{"counter":2}%
"""

Limit only one access to an endpoint at a time with threading.Lock

Limit only one access to an endpoint at a time with a process-level lock

Attach a Background Service to the Application

Run a background service behind the FastAPI server:

  • it shares the same asyncio main loop as the server
  • it starts when the server starts and stops when the server stops
  • it should be lightweight, without CPU-heavy workloads

Coroutines and Tasks — Python 3.11.4 documentation
Event Loop — Python 3.11.4 documentation

../code-snippets/python/app_background_service.py
from fastapi import FastAPI
import asyncio
import os

app = FastAPI()


class BackgroundService:
    def __init__(self):
        self.task: asyncio.Task | None = None

    async def work(self):
        print("Start background service")
        while True:
            print("Run background service...")
            # Sleep for 1 second between iterations
            await asyncio.sleep(1)

    async def start(self):
        # Must be called from within the running loop, i.e. after startup;
        # calling asyncio.get_running_loop() at import time would raise
        # RuntimeError because no loop is running yet.
        loop = asyncio.get_running_loop()
        self.task = loop.create_task(self.work())

    async def stop(self):
        self.task.cancel()
        try:
            await self.task
        except asyncio.CancelledError:
            print("Clean up background service")


service = BackgroundService()


@app.on_event("startup")
async def startup():
    print(f"PID[{os.getpid()}] app startup")
    # schedule the service task on the main loop
    await service.start()


@app.on_event("shutdown")
async def shutdown():
    # cancel the background service and wait for cleanup
    print(f"PID[{os.getpid()}] app shutdown")
    await service.stop()


@app.post("/")
async def hello():
    return {"value": f"hello world [{service.task.done()}] [{service.task.get_name()}]"}