FastAPI Request Handling
Hitesh (@hiteshchawla) · May 19

In FastAPI, how route handlers (endpoints) behave in terms of parallelism and concurrency depends on whether they are defined using async def or def, and whether the work inside them is I/O-bound or CPU-bound.

Here are the four combinations of route handlers and how they affect parallel or concurrent handling of requests:
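All four snippets below assume a small shared setup along these lines (a sketch; the router would be registered on the app with include_router as usual):

import asyncio
import time

from fastapi import APIRouter

router = APIRouter()  # registered on the app elsewhere, e.g. app.include_router(router)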


✅ 1. async def with async I/O-bound work (e.g., await asyncio.sleep, database calls)

@router.get("/async-io")
async def async_io_route():
    await asyncio.sleep(2)
    return {"status": "async io"}
  • Handled concurrently
  • Non-blocking — multiple such requests can be handled at the same time.
  • Best performance for I/O tasks like DB queries, network calls, file access.

✅ 2. async def with CPU-bound work (e.g., heavy computation, no await)

@router.get("/async-cpu")
async def async_cpu_route():
    result = sum(i * i for i in range(10**7))
    return {"result": result}
  • Not truly concurrent for CPU-bound work.
  • Blocks the event loop — slows down other async endpoints.
  • BAD practice — use a thread pool for CPU-bound tasks instead.
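As a sketch of that last point, the computation can be handed to a thread pool with run_in_executor so the event loop stays responsive (the route path and the heavy_computation helper are made up for illustration):

import asyncio
from fastapi import APIRouter

router = APIRouter()

def heavy_computation() -> int:
    return sum(i * i for i in range(10**7))  # same CPU-bound work as above

@router.get("/async-cpu-offloaded")
async def async_cpu_offloaded_route():
    loop = asyncio.get_running_loop()
    # None selects the loop's default ThreadPoolExecutor; the event loop stays free while the work runs
    result = await loop.run_in_executor(None, heavy_computation)
    return {"result": result}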

✅ 3. def with CPU-bound work

@router.get("/sync-cpu")
def sync_cpu_route():
    result = sum(i * i for i in range(10**7))
    return {"result": result}
  • Parallel execution via thread pool executor (Starlette/FastAPI handles this).
  • Slower than async I/O but doesn't block the event loop.
  • Suitable for CPU-bound work when properly limited.
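Sync routes all share Starlette's AnyIO-backed thread pool, whose default capacity limiter allows 40 concurrent threads in current AnyIO releases, so under heavy traffic the pool itself can become the bottleneck. One possible way to tune it, assuming a recent FastAPI/Starlette/AnyIO, is to raise the limit at startup; the value 100 below is only an example:

from contextlib import asynccontextmanager

import anyio
from fastapi import FastAPI

@asynccontextmanager
async def lifespan(app: FastAPI):
    # raise the cap on how many sync (def) routes may run in the thread pool at once;
    # 100 is an arbitrary example value, tune it to your workload
    anyio.to_thread.current_default_thread_limiter().total_tokens = 100
    yield

app = FastAPI(lifespan=lifespan)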

✅ 4. def with I/O-bound work (e.g., time.sleep)

@router.get("/sync-io")
def sync_io_route():
    time.sleep(2)
    return {"status": "sync io"}
  • Blocks thread and wastes resources.
  • Neither concurrent nor parallel in a performant way.
  • Worst option — avoid using blocking I/O in sync routes.
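When the blocking call is outbound I/O, the usual fix is an async client so the route falls back into case 1. A minimal sketch assuming the third-party httpx library and a hypothetical URL:

import httpx
from fastapi import APIRouter

router = APIRouter()

@router.get("/async-http")
async def async_http_route():
    # the request is awaited, so no thread sits idle while the response is in flight
    async with httpx.AsyncClient() as client:
        response = await client.get("https://example.com/api/data")  # hypothetical URL
    return {"status_code": response.status_code}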

Summary Table

| Route Type | I/O Type     | Concurrent? | Notes                                          |
| ---------- | ------------ | ----------- | ---------------------------------------------- |
| async def  | Async I/O    | ✅ Yes      | Best option for scalable I/O-bound endpoints   |
| async def  | CPU-bound    | ❌ No       | Blocks the event loop — BAD                    |
| def        | CPU-bound    | ✅ Parallel | Runs in thread pool — acceptable for CPU tasks |
| def        | Blocking I/O | ❌ No       | Blocks threads — worst case, avoid             |

Best Practices

  • Use async def + await for I/O-bound operations.
  • Offload CPU-heavy operations to a thread/process pool (e.g., run_in_executor()); a process-pool sketch follows this list.
  • Avoid blocking operations like time.sleep() in FastAPI routes.
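For the process-pool variant mentioned above (the "✅✅ True Parallel" row in the table below), one possible sketch uses a ProcessPoolExecutor, which sidesteps the GIL by running the work in separate processes; the pool size, route path, and cpu_task helper are illustrative:

import asyncio
from concurrent.futures import ProcessPoolExecutor

from fastapi import APIRouter

router = APIRouter()
process_pool = ProcessPoolExecutor(max_workers=2)  # illustrative size

def cpu_task() -> int:
    # must be a top-level function so it can be pickled into the worker process
    return sum(i * i for i in range(10**7))

@router.get("/async-cpu-multicore")
async def async_cpu_multicore_route():
    loop = asyncio.get_running_loop()
    result = await loop.run_in_executor(process_pool, cpu_task)  # runs on a separate CPU core
    return {"result": result}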

Here is a more detailed comparison of the different FastAPI route types, the kind of operation they perform, and whether request handling is parallel or concurrent:


🧩 FastAPI Route Behavior Comparison

| Route Type | Operation Type         | Example Code Snippet                       | Behavior           | Notes                                                 |
| ---------- | ---------------------- | ------------------------------------------ | ------------------ | ----------------------------------------------------- |
| async def  | Async I/O-bound        | await asyncio.sleep(1)                     | ✅ Concurrent      | Best for DB queries, API calls, file I/O, etc.        |
| async def  | CPU-bound              | sum(i * i for i in range(10**7))           | ❌ Blocking        | Blocks event loop – BAD pattern                       |
| async def  | CPU-bound (offload)    | await loop.run_in_executor(None, cpu_task) | ✅ Parallel        | Offloads to thread pool – does not block event loop   |
| async def  | CPU-bound (multi-core) | run_in_executor(ProcessPool, cpu_task)     | ✅✅ True Parallel | Uses multiple CPU cores – best for heavy computations |
| def        | CPU-bound              | sum(i * i for i in range(10**7))           | ✅ Parallel        | Runs in thread pool – doesn't block event loop        |
| def        | Blocking I/O           | time.sleep(2)                              | ❌ Blocking        | Wastes threads – avoid blocking I/O in sync functions |

✅ Legend

  • Concurrent: Multiple tasks share the same thread (async I/O).
  • Parallel: Tasks run in separate threads or processes simultaneously.
  • Blocking: One task prevents others from proceeding.

Comments (3)

  • Ansil Graves · May 27, 2025

    This is a good overview, but I feel like it glosses over how thread pools can become a bottleneck themselves, especially with high traffic. Maybe a bit more detail about thread pool limitations and tuning would be helpful?

    • Javier de Toro · Aug 5, 2025

      Totally agree. I came to this post precisely because I'm having the same problem. Following best practices, I send heavy CPU-bound tasks to other processes, and when it's not possible to make non-blocking I/O operations (using async libraries), I send them to a separate thread pool. But when running load API tests, thread pools clearly become a bottleneck. Did you find any solution?

  • Dillion Huston · Aug 13, 2025

    Nice write-up, Hitesh! I like how you explained the different route handler types. I’ve definitely tripped up a few times mixing sync and async code and wondered why things weren’t running smoothly.

    Also agree that offloading heavy tasks to threads is a must. FastAPI makes async stuff easy, but you still gotta watch out for blocking code.

    Thanks for sharing this, super helpful for folks getting deeper into FastAPI!
