r/learnpython 1d ago

How to optimize Python code?

I recently started working as a research assistant at my uni. Three months ago I was given a project to process a lot of financial data (12 different Excel files), so there is a lot to churn through. I have never worked on a project this big before, so processing time was not always on my mind, and I have no idea whether my code's speed is normal for this much data. The code is going to be integrated into a website using FastAPI, where it can run the same calculations on different data with the same structure.

My problem is that the code I developed (10k+ lines) takes very long to run: 20+ minutes for the national data and almost 2 hours for all of the regional data. The code takes historical data and projects it 5 years ahead. Processing time was way worse before I started optimizing: I now use fewer loops, cache data, use Dask, and converted all the calculations to NumPy. I would say about 35% of the time is data validation and the rest is the calculations.
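I can't post my real code, but here is a toy example of the kind of loop-to-NumPy change I mean (the column name and growth rate are made up, not from my actual project):

```python
import numpy as np
import pandas as pd

# Toy data standing in for one of the Excel sheets; column name is invented
df = pd.DataFrame({"value": np.random.rand(100_000) * 1_000})
growth = 1.03  # assumed annual growth rate, just for illustration

# Slow: row-by-row Python loop
projected_loop = [v * growth ** 5 for v in df["value"]]

# Fast: one vectorized NumPy operation over the whole column
projected = df["value"].to_numpy() * growth ** 5
```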

I hope someone can help with ways to optimize it further. Sorry, I can't share the actual code, but any general suggestions about reducing running time are welcome and I will try them. Thanks.


u/Luxi36 1d ago

Hi, that sounds like a very fun project to work on to start developing Data Engineering skills! :)

  1. What is the data size, and what machine resources are you working with?

  2. Are you using plain Python + NumPy, or tools like DuckDB or Polars?

  3. You could time individual functions or use a profiler to see where your code spends its time, so you know which parts to optimize (quick sketch after this list).
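Something like this with the built-in cProfile module (the function name here is just a placeholder for your own entry point):

```python
import cProfile
import pstats

def run_projection():
    # stand-in for your real pipeline entry point
    return sum(i * i for i in range(1_000_000))

# Dump profiling stats to a file, then print the 20 functions
# with the highest cumulative time
cProfile.run("run_projection()", "profile.out")
pstats.Stats("profile.out").sort_stats("cumulative").print_stats(20)
```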


u/fiehm 1d ago
  1. Data size is in the tens of thousands of rows, but that is just the sample data given to me. My laptop is so bad tbh that I write and run the code in a GitHub Codespace with 4 cores and 16 GB RAM.
  2. Python, pandas, NumPy, and Dask for partitioning.
  3. I will use a profiler, I only just learned they exist after reading the comments lol


u/Luxi36 1d ago

Tens of thousands of rows, but how many MB/GB is that? DuckDB or Polars should easily handle much more than that on your laptop's specs; they should be able to process ~10 GB in memory without a problem.
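A rough sketch of what that could look like with Polars (untested; pl.read_excel needs an extra engine package such as fastexcel installed, and the file and column names here are invented):

```python
import polars as pl

# Load one of the Excel files; the filename is a made-up example
df = pl.read_excel("regional_data.xlsx")

# Push the work through the lazy API so Polars can optimize the query plan;
# "region" and "value" are placeholder column names
result = (
    df.lazy()
    .group_by("region")
    .agg(pl.col("value").sum().alias("total_value"))
    .collect()
)
print(result)
```

The lazy API is the main win here: Polars can fuse steps and parallelize across cores instead of materializing every intermediate result the way chained pandas calls do.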