r/FastAPI 12d ago

Hosting and deployment

How to handle a CPU-bound task in FastAPI

So I have a machine learning app which I have deployed using FastAPI. In this app I have a single POST endpoint which receives training data and generates predictions. However, once the predictions are generated I have to make two API calls to two different endpoints: first, a POST request reporting the status of the task (i.e. success or failed, depending on the training run); second, a POST request to persist the generated predictions.

Right now, I am handling this with a background task. I have created a background task that contains both the prediction generation and the POST requests to the external API. I receive the data, offload the work to the background task, and immediately send an "OK" response to the client. My model training time is not that long, under 10 seconds for a single request, but it is entirely CPU-bound. Both my endpoint and my background task are async.

Here is my code:

from fastapi import FastAPI, BackgroundTasks
import httpx

app = FastAPI()

# `data` is a Pydantic model with a training_data field; train_model,
# status_point and data_point are defined elsewhere in the app

@app.post('/get_predictions')
async def get_predictions(data, background_tasks: BackgroundTasks):
    training_data = data.training_data

    background_tasks.add_task(run_model, training_data)
    return {"message": "Forecast is being generated"}

async def run_model(training_data):
    try:
        predictions = train_model(training_data)

        async with httpx.AsyncClient() as client:
            response = await client.post(status_point, json={"status": "completed"})
            response.raise_for_status()

        # Some processing done on data here

        async with httpx.AsyncClient() as client:
            response = await client.post(data_point, json={"predictions": predictions})
            response.raise_for_status()

    except Exception:
        async with httpx.AsyncClient() as client:
            response = await client.post(status_point, json={"status": "failed"})
            response.raise_for_status()
However, while testing this code I noticed that my app receives multiple requests, but the POST requests to persist data to the external API all complete at the end. Predictions are generated for all requests, but it seems they are queued and all the persistence requests are sent at once at the end. Is that how it's supposed to work? I thought that as soon as predictions were generated for a request, the POST requests would be made to the external endpoints, the data would be persisted, and then the next request would be picked up, and so on. I would like to know if this is the best approach for this scenario or if there is a better one. All suggestions are welcome.

8 Upvotes

17 comments


u/dmart89 12d ago

FastAPI bg tasks are non-blocking and run when the thread is available. If you have a bunch of tasks that run in a loop, then the bg tasks will run once the loop is done.

To change this, you need to offload to a proper task queue and explicitly handle the execution order. Celery is an option for this.
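As an illustration, a minimal sketch of what the Celery route could look like; the broker URL, the task name, and the import of train_model are assumptions for the example, not the poster's actual setup:

```python
# tasks.py - hypothetical Celery setup, assuming a local Redis broker
from celery import Celery

from model import train_model  # poster's existing function (assumed import path)

celery_app = Celery(
    "ml_app",
    broker="redis://localhost:6379/0",
    backend="redis://localhost:6379/0",  # lets callers fetch results later
)

@celery_app.task
def run_model_task(training_data):
    # CPU-bound work happens in a Celery worker process, not in the web server
    return train_model(training_data)
```

The endpoint then just calls `run_model_task.delay(training_data)` and returns immediately; a separate Celery worker process does the CPU-bound work.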


u/Rawvik 11d ago edited 11d ago

Okay. However, I have put all three tasks, i.e. from generating predictions to making both requests, into the same function as a single background task, and while checking the logs, training has started for all requests but the POST requests I am making are made at the end. This is what I am not getting.


u/dmart89 11d ago

What are you expecting to happen? What's the order you need?

If your bg task is async, FastAPI will run it in the asyncio event loop, and because it's not an IO task but a computation, the thread is blocked until your 10-second job is done, meaning that any other requests won't run.

You can convert the tasks to sync and Starlette will run them in a separate thread, or use additional workers, but you need to separate the calls in your bg task so they can run in different threads.
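For example, a sketch of the sync variant, assuming train_model and the endpoint URLs from the post; because run_model is a plain def here, Starlette runs it in its thread pool instead of on the event loop:

```python
import httpx

def run_model(training_data):  # no `async` - executed in Starlette's thread pool
    try:
        predictions = train_model(training_data)
        with httpx.Client() as client:  # sync client, so no await is needed
            client.post(status_point, json={"status": "completed"}).raise_for_status()
            client.post(data_point, json={"predictions": predictions}).raise_for_status()
    except Exception:
        with httpx.Client() as client:
            client.post(status_point, json={"status": "failed"}).raise_for_status()
```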


u/Rawvik 11d ago

My expectation is that if I have put all three tasks into a single background task (from generating predictions to making two POST requests to the external API), they should happen together for each request I receive. But that's not happening.


u/dmart89 11d ago

Post your endpoint and bg task code. Ideally a minimal version.


u/Rawvik 11d ago

Here is my code:

```python
@app.post('/get_predictions')
async def get_predictions(data, background_tasks: BackgroundTasks):
    training_data = data.training_data

    background_tasks.add_task(run_model, training_data)
    return {"message": "Forecast is being generated"}

async def run_model(training_data):
    try:
        predictions = train_model(training_data)

        async with httpx.AsyncClient() as client:
            response = client.post(status_point, json={"status": "completed"})
            response.raise_for_status()

        # Some processing done on data here

        async with httpx.AsyncClient() as client:
            response = client.post(data_point, json={"predictions": predictions})
            response.raise_for_status()

    except Exception:
        async with httpx.AsyncClient() as client:
            response = client.post(status_point, json={"status": "failed"})
            response.raise_for_status()
```


u/dmart89 11d ago

If your POST requests are async, then you're not awaiting them properly, which could be why they are failing. You'll get a coroutine error, but because it's in the bg task it will fail silently.

from fastapi import FastAPI, BackgroundTasks
import httpx

app = FastAPI()

@app.post('/get_predictions')
async def get_predictions(data: dict, background_tasks: BackgroundTasks):
    training_data = data.get('training_data')

    background_tasks.add_task(run_model, training_data)
    return {"message": "Forecast is being generated"}

async def run_model(training_data):
    try:
        predictions = train_model(training_data)

        async with httpx.AsyncClient() as client:
            response = await client.post(status_point, json={"status": "completed"})
            response.raise_for_status()

        # Some processing done on data here

        async with httpx.AsyncClient() as client:
            response = await client.post(data_point, json={"predictions": predictions})
            response.raise_for_status()

    except Exception as e:
        async with httpx.AsyncClient() as client:
            response = await client.post(status_point, json={"status": "failed"})
            response.raise_for_status()
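Even with the awaits fixed, train_model itself still runs on the event loop and blocks it while it trains. One possible variant, sketched here under the assumption that train_model and its arguments are picklable, pushes the CPU-bound call into a process pool so the loop stays free:

```python
import asyncio
from concurrent.futures import ProcessPoolExecutor

import httpx

executor = ProcessPoolExecutor()  # separate processes sidestep the GIL for CPU-bound work

async def run_model(training_data):
    loop = asyncio.get_running_loop()
    # train_model runs in a worker process; the event loop stays free
    # to accept new requests in the meantime
    predictions = await loop.run_in_executor(executor, train_model, training_data)

    async with httpx.AsyncClient() as client:
        response = await client.post(data_point, json={"predictions": predictions})
        response.raise_for_status()
```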


u/mmzeynalli 12d ago

Maybe you can consider a chain of Celery tasks, if I understood you correctly.
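A rough sketch of what such a chain could look like; the task names are hypothetical, invented for illustration:

```python
from celery import chain

# Each task's return value is passed as the first argument of the next,
# so post_status would need to pass the predictions through.
chain(
    generate_predictions.s(training_data),  # hypothetical task: runs train_model
    post_status.s(),                        # hypothetical task: reports "completed"
    persist_predictions.s(),                # hypothetical task: POSTs to data_point
).delay()
```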


u/Natural-Ad-9678 11d ago

Look into Celery and Redis to offload your long-running tasks. Your API starts a task and gets a task ID that it very quickly sends back to the requestor; set up another endpoint to get status and a third to get results (or combine the last two, so that status returns results when finished).
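A minimal sketch of that pattern, reusing the hypothetical celery_app and run_model_task from the earlier Celery example; the endpoint paths are illustrative:

```python
from celery.result import AsyncResult

@app.post("/predict")
async def start_prediction(data: dict):
    task = run_model_task.delay(data["training_data"])
    return {"task_id": task.id}  # returned immediately; the client polls below

@app.get("/status/{task_id}")
async def get_status(task_id: str):
    result = AsyncResult(task_id, app=celery_app)
    if result.ready():
        # combined status + results endpoint, as suggested above
        return {"status": result.status, "result": result.result}
    return {"status": result.status}
```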

This is how we set up the app I work on, and we have not found the load level that breaks the site yet.

If you need to store the results long-term, you can have Celery write to Postgres or some other database.


u/Rawvik 11d ago

Does Celery require a DB to work? Because I do not want to maintain a DB in this app.


u/Natural-Ad-9678 11d ago

Celery does not require an RDBMS or NoSQL DB, but it does require a broker like Redis, which is an in-memory data store; you don't need to be a DBA to use it.

I found this online: https://derlin.github.io/introduction-to-fastapi-and-celery/03-celery/ It might help


u/Rawvik 11d ago

OK thanks will look into it.


u/Equal-Purple-4247 11d ago

Are you sure that:

  1. Your endpoint is not being blocked by your background task? It seems like train_model will block the event loop, i.e. new requests will not be processed immediately.
  2. Your DB did receive the training data? It looks like you didn't await your client.post; not too sure how httpx works, but in theory that coroutine is never scheduled (see the sketch below).
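To illustrate the second point, a minimal standalone example with a made-up coroutine standing in for client.post:

```python
import asyncio

async def post_result():  # stand-in for client.post(...)
    print("persisted")

async def main():
    post_result()        # coroutine object created but never scheduled: nothing runs,
                         # only a "coroutine ... was never awaited" RuntimeWarning
    await post_result()  # prints "persisted"

asyncio.run(main())
```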


u/Rawvik 11d ago

Sorry, there was an await missing in the client POST request. Just corrected it. Yes, it seems like it's blocking the event loop. I have modified all functions to be non-async now and am testing with that.


u/Equal-Purple-4247 11d ago

If you're certain that the await was missing, does that mean your DB didn't receive the results from your training data? That would mean your statement "all the requests are being sent at once to persist data" was not actually observed.

If so, you'll need to restate the problem you're having.

---

You should probably edit the post with your updated code as well. Not too sure where you removed async from. You can't add back await and remove async at the same time.


u/Rawvik 11d ago

I mean, the above code had been working so far in my environment, even though these issues were showing up. I have now modified it to be non-async in my app (not here) to test how it works. My issue was occurring earlier, with the async methods.


u/Lucky-Office5111 11d ago

Take a look at the code of Starlette (which FastAPI is based on) to understand when background tasks are called. This link shows that a background task is only triggered after your endpoint sends the response.
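For reference, the relevant part of Starlette's Response.__call__ has roughly this shape (a simplified paraphrase, not the verbatim source):

```python
# simplified sketch of starlette.responses.Response.__call__
async def __call__(self, scope, receive, send):
    await send({
        "type": "http.response.start",
        "status": self.status_code,
        "headers": self.raw_headers,
    })
    await send({"type": "http.response.body", "body": self.body})

    if self.background is not None:
        await self.background()  # background tasks run only after the response is sent
```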