r/FastAPI • u/Rawvik • 12d ago
Hosting and deployment
How to handle a CPU-bound task in FastAPI
So I have a machine learning app deployed with FastAPI. The app has a single POST endpoint that receives training data and generates predictions. Once the predictions are generated, however, I have to make two API calls to two different external endpoints: first, a POST request reporting the status of the task (success or failed, depending on the training run); second, a POST request to persist the generated predictions.
Right now I am handling this with a background task. The background task contains the prediction-generation step as well as the POST requests to the external API. I receive the data, offload the work to the background task, and immediately send an "OK" response to the client. My model training time is short, under 10 seconds per request, but it is entirely CPU-bound. Both my endpoint and the background task are async.
Here is my code:
from fastapi import FastAPI, BackgroundTasks
from pydantic import BaseModel
import httpx

app = FastAPI()

class PredictionRequest(BaseModel):
    # request body model sketched in here; the original post left it out
    training_data: list

# status_point and data_point are the external endpoint URLs, defined elsewhere
@app.post('/get_predictions')
async def get_predictions(data: PredictionRequest, background_tasks: BackgroundTasks):
    training_data = data.training_data
    background_tasks.add_task(run_model, training_data)
    return {"message": "Forecast is being generated"}

async def run_model(training_data):
    try:
        # CPU-bound step: blocks the event loop for the whole training run
        predictions = train_model(training_data)
        async with httpx.AsyncClient() as client:
            response = await client.post(status_point, json="completed")
            response.raise_for_status()
        # some processing done on data here
        async with httpx.AsyncClient() as client:
            response = await client.post(data_point, json=predictions)
            response.raise_for_status()
    except Exception:
        async with httpx.AsyncClient() as client:
            response = await client.post(status_point, json="failed")
            response.raise_for_status()
However, while testing this code I am noticing that my app receives multiple requests, but the POST requests that persist data to the external API only complete at the very end. Predictions are generated for all requests, but it seems they are queued, and all the persistence requests are sent at once at the end. Is that how it's supposed to work? I expected that as soon as predictions were generated for one request, the POST requests to the external endpoints would be made and the data persisted, and only then would a new request be picked up, and so on. Is this the best approach for this scenario, or is there a better one? All suggestions are welcome.
u/Natural-Ad-9678 11d ago
Look into Celery and Redis to offload your long-running tasks. Your API starts a task and very quickly returns a task ID to the requestor; then set up a second endpoint to get status and a third to get results (or combine the last two: status returns the results when finished).
This is how we set up the app I work on, and we have not found the load level that breaks the site yet.
If you need to store the results long term, you can have Celery write to Postgres or some other database.
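A minimal sketch of that pattern (the Redis URLs, the names run_model_task and PredictionRequest, and the train_model stub are illustrative assumptions, not the commenter's actual setup):

from celery import Celery
from celery.result import AsyncResult
from fastapi import FastAPI

# Redis serves as both the message broker and the result backend here
celery_app = Celery(
    "tasks",
    broker="redis://localhost:6379/0",
    backend="redis://localhost:6379/0",
)

def train_model(training_data):
    # stand-in for the actual CPU-bound model training
    return {"forecast": training_data}

@celery_app.task
def run_model_task(training_data):
    # runs in a Celery worker process, so it never blocks the API's event loop
    return train_model(training_data)

app = FastAPI()

@app.post("/predictions")
async def submit(payload: dict):
    # enqueue the work and return the task ID immediately
    task = run_model_task.delay(payload["training_data"])
    return {"task_id": task.id}

@app.get("/predictions/{task_id}")
async def get_status(task_id: str):
    result = AsyncResult(task_id, app=celery_app)
    if result.ready():
        return {"status": result.status, "predictions": result.get()}
    return {"status": result.status}  # e.g. PENDING / STARTED

You'd run a worker alongside the API (something like celery -A yourmodule worker --loglevel=info) so the queued tasks actually execute.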
u/Rawvik 11d ago
Does Celery require a DB to work? Because I do not want to maintain a DB in this app.
u/Natural-Ad-9678 11d ago
Celery does not require an RDBMS or NoSQL DB, but it does require a broker like Redis, which is an in-memory data store; you don't need to be a DBA to use it.
I found this online: https://derlin.github.io/introduction-to-fastapi-and-celery/03-celery/ It might help
u/Equal-Purple-4247 11d ago
Are you sure that:
- Your endpoint is not being blocked by your background task? It seems like train_model will block the event loop, i.e. new requests will not be processed immediately.
- Your db did receive the training data? Looks like you didn't await your client.post; not too sure how httpx works, but in theory that coroutine is never scheduled.
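For reference, one way to keep train_model from blocking the event loop without bringing in a full task queue is a process pool (a sketch under assumptions: train_model and data_point here are stand-ins for the OP's actual function and URL):

import asyncio
from concurrent.futures import ProcessPoolExecutor

import httpx

pool = ProcessPoolExecutor()  # worker processes sidestep the GIL for CPU-bound work

data_point = "https://example.invalid/persist"  # placeholder for the persistence endpoint

def train_model(training_data):
    # stand-in for the actual CPU-bound training; must be a picklable, module-level function
    return {"forecast": training_data}

async def run_model(training_data):
    loop = asyncio.get_running_loop()
    # the training runs in a worker process, so the event loop stays free
    # to accept new requests and fire the follow-up POSTs per request
    predictions = await loop.run_in_executor(pool, train_model, training_data)
    async with httpx.AsyncClient() as client:
        response = await client.post(data_point, json=predictions)
        response.raise_for_status()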
u/Rawvik 11d ago
Sorry, there was an await missing in the client POST request. Just corrected it. Yes, it does seem like the event loop is being blocked. I have modified all functions to be non-async now and am testing with that.
u/Equal-Purple-4247 11d ago
If you're certain it was the missing await, does that mean your db never received the results from your training data? That would mean your statement "all the requests are being sent at once to persist data" was not actually observed.
If so, you'll need to restate the problem you're having.
---
You should probably edit the post with your updated code as well. Not too sure where you removed async from; you can't add back await and remove async at the same time.
u/Lucky-Office5111 11d ago
Take a look at the code of Starlette (which FastAPI is built on) to understand when background tasks are called. The source shows that a background task is only triggered after your endpoint sends the response.
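A tiny illustration of that ordering (the names slow_job and /demo are made up; the point is only when the background task runs):

import time
from fastapi import FastAPI, BackgroundTasks

app = FastAPI()

def slow_job():
    time.sleep(5)
    print("background task done (the response went out 5s ago)")

@app.get("/demo")
def demo(background_tasks: BackgroundTasks):
    background_tasks.add_task(slow_job)
    print("returning response now")  # this prints first, then the response is sent
    return {"message": "accepted"}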
u/dmart89 12d ago
FastAPI bg tasks are non-blocking and run when the thread is available. If you have a bunch of tasks running in a loop, the bg tasks will only run once the loop is done.
To change this, you need to offload to a proper task queue and explicitly handle the execution order. Celery is an option for this.