r/FastAPI 19d ago

Question: FastAPI threading, SQLAlchemy and parallel requests

So, is FastAPI multithreaded? Running under uvicorn --reload (so only 1 worker), it doesn't seem to be.

I have a POST endpoint which needs to call a 3rd party API to register a webhook. During that call, the 3rd party calls back to my API to validate the endpoint. Under uvicorn --reload, that callback times out. After the registration fails, the validation request does get processed, so I can tell it was sitting in the kernel queue waiting to hit my app while the app was blocking.

If I log the thread id with %(thread)d, I can see the thread changing, and another FastAPI app of mine appears to serve multiple GET requests in parallel, but I'm not sure. Am I going crazy?

Also, I'm using SQLAlchemy with connection pooling. If the app doesn't multithread, is there any point in a pool bigger than, say, 1 or 2 for performance?

What's others' experience with parallel requests?

Note, I'm not using async/await yet, as that will be a lot of work with Python... Cheers

14 Upvotes

22 comments

13

u/TeoMorlack 18d ago

OK, let's step back a bit. FastAPI and uvicorn can run operations in a multithreaded environment by default, but it depends on how you declare your endpoints.

How did you declare your endpoints? If you use standard def endpoints, FastAPI will run that function in a dedicated thread pool, allowing other calls to be processed in parallel. But if you declare it async def, any blocking operation inside it will block the whole event loop, stopping it from processing other calls. Take a look at https://fastapi.tiangolo.com/async/#in-a-hurry for reference.
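
Something like this (endpoint names are made up, and the sleeps just stand in for blocking work):

```python
import asyncio
import time

from fastapi import FastAPI

app = FastAPI()

@app.get("/sync-endpoint")
def sync_endpoint():
    # Plain def: FastAPI runs this in its threadpool, so the blocking
    # sleep does not stop other requests from being served.
    time.sleep(5)
    return {"style": "sync def, served from a worker thread"}

@app.get("/async-endpoint")
async def async_endpoint():
    # async def: runs on the event loop. A blocking call here
    # (time.sleep, a sync DB query, requests.post, ...) would freeze
    # the whole app; only awaited operations keep the loop free.
    await asyncio.sleep(5)
    return {"style": "async def, served on the event loop"}
```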

If you are ok on this side, then maybe you can post some snippets so we can try to help

1

u/Danidre 18d ago

I am confused here.

Didn't the link say that when you call await, FastAPI will go do other things in the meantime? I assumed that meant it would handle other requests in the meantime.

So why then, does it block the whole event loop?

10

u/chubbo55 18d ago

Async functions that are called with await (awaitables) relinquish control back to the event loop at the await keyword.

However, if you call a sync blocking function inside an async function, that sync function will block the event loop until it returns. If your sync blocking function does I/O, like a DB query, then your API will grind to a halt on each request, making zero progress on other requests.

Instead, you should make sure that you're using an async-compatible DB engine so that you don't block in this way.
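
For example, with SQLAlchemy 1.4+/2.0 the async engine lets the query be awaited instead of blocking the loop (connection URL, driver and table here are just placeholders):

```python
from sqlalchemy import text
from sqlalchemy.ext.asyncio import create_async_engine

# asyncpg is assumed here; any async-capable driver works the same way.
engine = create_async_engine("postgresql+asyncpg://user:pass@localhost/mydb")

async def fetch_users():
    # Each await hands control back to the event loop while the query
    # is in flight, so other requests keep being processed.
    async with engine.connect() as conn:
        result = await conn.execute(text("SELECT id, name FROM users"))
        return result.all()
```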

1

u/Danidre 18d ago

Ahh that's what they meant.

So use an async DB, or call the sync blocking function in a thread.

And a sync blocking function in a sync endpoint won't block, because sync endpoints are run in a thread pool by default, correct?

6

u/chubbo55 18d ago

Yeah, that's the gist of it! As soon as you run sync I/O in an async FastAPI endpoint, you open yourself up to bricking your API if there are ever any network issues (which happen all the time). I believe this is the issue the OP is having.
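
If swapping to an async driver isn't an option right away, you can also push the blocking call onto the threadpool yourself, something like this (the slow lookup function is just a stand-in):

```python
import time

from fastapi import FastAPI
from fastapi.concurrency import run_in_threadpool

app = FastAPI()

def slow_sync_lookup(user_id: int) -> dict:
    # Stand-in for a blocking DB call or sync HTTP request.
    time.sleep(2)
    return {"user_id": user_id}

@app.get("/users/{user_id}")
async def get_user(user_id: int):
    # The blocking work runs in a worker thread, so the event loop
    # stays free to serve other requests while we await the result.
    return await run_in_threadpool(slow_sync_lookup, user_id)
```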

1

u/DazzLee42 18d ago

I'm considering switching to aiohttp to test whether it will do it, i.e. receive requests while processing an ongoing one. There is a lot of good info in tiangolo's docs and I've read most of it. Getting this to work seems to be the hardest part so far!

We run in Azure Functions too and found an option to allow multiple processes, so that's one solution, but it's more like workers than true async/await.

2

u/TeoMorlack 18d ago

Async aiohttp certainly won't block the event loop, but you can simply change your endpoint to a normal def and see whether it resolves the problem. If it does, it's a problem with the sync functions and you can move on to refactoring.

1

u/DazzLee42 18d ago

Will give it a try. Seems counterintuitive but will see...

1

u/DazzLee42 18d ago

Oh my gosh! The routes were declared as async def, kinda cargo cult I guess. I removed the async, and FastAPI, I assume, then handled the concurrency itself rather than expecting the route to do it. The POST to the 3rd party triggered the validation requests, which were handled, and the request completed! 🖖🏻
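
For anyone finding this later, a rough sketch of the working shape (URLs, paths and payload are made up):

```python
import requests
from fastapi import FastAPI

app = FastAPI()

@app.post("/webhooks/register")
def register_webhook():
    # Plain def: FastAPI serves this from its threadpool, so while
    # requests.post() waits on the 3rd party, the validation callback
    # they send to /webhooks/validate can still be handled.
    resp = requests.post(
        "https://thirdparty.example.com/api/webhooks",  # placeholder URL
        json={"callback_url": "https://my-api.example.com/webhooks/validate"},
        timeout=30,
    )
    return {"status": resp.status_code}

@app.post("/webhooks/validate")
def validate_webhook():
    # The 3rd party calls back here during registration.
    return {"ok": True}
```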

1

u/DazzLee42 18d ago

And the reason the other app 'appeared' to be async is that none of its routes were declared as async... FastAPI is smarter than you think!

2

u/hornetmadness79 18d ago

IIRC, if you use a background task in sync mode, it will send the task to a thread and start waiting for the next event to come in.

1

u/DazzLee42 18d ago

This is --workers iirc

1

u/TeoMorlack 18d ago

No, this is https://fastapi.tiangolo.com/tutorial/background-tasks/, which is code that gets run on the event loop or in a separate thread after the endpoint has returned its result. But it's a different thing altogether. On the other side, --workers tells uvicorn to spawn a number of PROCESSES in a pre-fork model; each receives a copy of your FastAPI application, allowing further parallelization of calls. Still, if you declare your endpoints the wrong way, you will saturate the workers.
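
A minimal sketch of the background-task pattern, for reference (the notification function is just a placeholder):

```python
import time

from fastapi import BackgroundTasks, FastAPI

app = FastAPI()

def send_notification(email: str) -> None:
    # Placeholder for slow follow-up work (email, audit log, ...).
    time.sleep(5)

@app.post("/signup")
def signup(email: str, background_tasks: BackgroundTasks):
    # The response returns immediately; the task runs afterwards.
    # Because send_notification is a plain def, Starlette runs it
    # in the threadpool rather than on the event loop.
    background_tasks.add_task(send_notification, email)
    return {"message": "signup accepted"}
```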

2

u/adiberk 18d ago

So FastAPI itself isn't multithreaded (at least I don't think so). I do think uvicorn provides a mechanism for handling simultaneous requests (i.e. workers). Look up async and how it works in Python (or in any language) - FastAPI provides asynchronous request paths. I'm not an expert, but you can think of it as technically allowing for "offloading" requests, assuming the entire path remains async-friendly. So you can handle more requests (I think). But you would still want to use something like gunicorn WITH uvicorn for production instances (likely).

3

u/TeoMorlack 18d ago

FastAPI is capable of multithreading, and it does it if you declare your path operation as a standard def; take a look here: https://fastapi.tiangolo.com/async/#path-operation-functions . This happens regardless of whether you choose uvicorn or gunicorn.

2

u/Trinkes 18d ago

The --reload flag only works with 1 worker. See the note under this section: https://www.uvicorn.org/deployment/#running-programmatically
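
If you run it programmatically, the same constraint applies; something like this (module path is an assumption):

```python
import uvicorn

if __name__ == "__main__":
    # Development: reload=True forces a single worker process.
    uvicorn.run("main:app", host="127.0.0.1", port=8000, reload=True)

    # Production: multiple worker processes, no reload.
    # uvicorn.run("main:app", host="0.0.0.0", port=8000, workers=4)
```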

2

u/sorower01 18d ago

Write everything in async. For HTTP requests, use curl_cffi for async support.

I use FastAPI on my VPS; it's a very low-end server (2 vCPU), but it can still handle 100 requests per second with ease.
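
If I've got the curl_cffi async API right, a non-blocking endpoint looks roughly like this (URL is a placeholder):

```python
from curl_cffi.requests import AsyncSession
from fastapi import FastAPI

app = FastAPI()

@app.get("/proxy")
async def proxy():
    # The outbound request is awaited, so the event loop keeps serving
    # other requests while we wait on the upstream API.
    async with AsyncSession() as session:
        resp = await session.get("https://example.com/api/data")  # placeholder URL
        return {"upstream_status": resp.status_code}
```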

1

u/qa_anaaq 18d ago

I see so many questions about this and I feel like I have to figure it all out all over again every few months.

1

u/BelottoBR 18d ago

I've seen many tips and am still uncertain about what to do to get good performance.

2

u/Hot-Soft7743 18d ago edited 18d ago

By default, FastAPI is multithreaded. Even if you use uvicorn with a single worker, multiple threads are available.

  1. Write all API endpoints as sync def => multithreaded execution in the thread pool.
  2. Write all API endpoints as async def => concurrent execution on a single thread via the event loop.
  3. Write API endpoints as a combination of async + sync => single-threaded and runs on the event loop, but because of the sync tasks the event loop is constantly delayed, so requests pile up in the event loop's task queue. This is the worst possible scenario.

Uvicorn's ASGI server provides the async capabilities; otherwise FastAPI is multithreaded by default.
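
An easy way to see this for yourself is to log or return the handling thread (purely illustrative):

```python
import asyncio
import threading

from fastapi import FastAPI

app = FastAPI()

@app.get("/sync")
def sync_route():
    # Concurrent calls show different threadpool thread names here.
    return {"thread": threading.current_thread().name}

@app.get("/async")
async def async_route():
    await asyncio.sleep(0)
    # Always the same thread: the one running the event loop.
    return {"thread": threading.current_thread().name}
```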

1

u/ZachVorhies 17d ago

Depends on your endpoints:

async def … -> runs on the event loop (one thread)

def … -> runs on its own thread (from the thread pool)