
Fix MemoryError in FastAPI

Resolve MemoryError in FastAPI applications caused by large request bodies, synchronous blocking, and improper async resource handling.


FastAPI runs on Uvicorn's async event loop. MemoryError typically appears when large payloads are buffered wholesale in memory, or when synchronous work blocks the event loop and pending requests — along with their buffered bodies — pile up.

Large Request Bodies

Calling file.read() with no size argument loads the entire upload into memory at once:

# BAD — entire file in memory
@app.post("/upload")
async def upload(file: UploadFile):
    contents = await file.read()  # 500MB file = crash
    process(contents)

# GOOD — stream to disk in small chunks
@app.post("/upload")
async def upload(file: UploadFile):
    # Writes here are synchronous but short; use aiofiles for fully async file I/O
    with open("/tmp/upload", "wb") as f:
        while chunk := await file.read(8192):
            f.write(chunk)

Streaming Responses

The same rule applies on the way out: yield the response in chunks instead of building the whole payload in memory first.

from fastapi.responses import StreamingResponse

@app.get("/download")
async def download():
    async def generate():
        async for chunk in fetch_large_data():
            yield chunk
    return StreamingResponse(generate(), media_type="application/octet-stream")

Blocking Work on the Event Loop

Synchronous, CPU-heavy calls inside an async endpoint block the event loop; while it is blocked, new requests keep arriving and their buffered bodies accumulate in memory:

# BAD — blocks the event loop
@app.post("/process")
async def process_data(data: dict):
    heavy_computation(data)  # Synchronous!

# GOOD — offload to thread pool
from fastapi.concurrency import run_in_threadpool

@app.post("/process")
async def process_data(data: dict):
    await run_in_threadpool(heavy_computation, data)

Uvicorn Worker Settings

uvicorn app.main:app --workers 4 --limit-max-requests 1000

--limit-max-requests restarts each worker after it has served 1000 requests, so gradual memory leaks are cleared before they grow into a MemoryError.
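Uvicorn can also cap in-flight requests so buffered bodies cannot pile up without bound. The limit of 100 below is a placeholder — size it to your workload; requests beyond the cap receive an immediate 503 instead of queueing:

```shell
# Reject requests beyond 100 concurrent with a 503 instead of queueing them
uvicorn app.main:app --workers 4 --limit-max-requests 1000 --limit-concurrency 100
```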

Bugsly's Python SDK captures MemoryError and slow request alerts in FastAPI, with full async stack traces.

Try Bugsly Free

AI-powered error tracking that explains your bugs. Set up in 2 minutes, free forever for small projects.
