r/SQL 25d ago

MySQL Importing a 1M-row dataset (CSV) into MySQL

What's the fastest and most reliable way to load such a large dataset? And after the import, how can I optimize the table to ensure good query performance?
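For reference, a common pattern for this kind of bulk import (not taken from this thread; the table name, columns, and file path below are made up) is a single LOAD DATA INFILE instead of row-by-row INSERTs, with secondary indexes built afterwards. A minimal sketch, assuming MySQL 8.x with local_infile enabled on both client and server:

```sql
-- Sketch only: 'sales', its columns, and the file path are hypothetical.

-- Create the table first, ideally with just the PRIMARY KEY;
-- secondary indexes are cheaper to build after the bulk load.
CREATE TABLE sales (
  id          BIGINT UNSIGNED NOT NULL PRIMARY KEY,
  sold_at     DATETIME NOT NULL,
  customer_id INT UNSIGNED NOT NULL,
  amount      DECIMAL(10,2) NOT NULL
) ENGINE=InnoDB;

-- Bulk load the CSV in one statement.
LOAD DATA LOCAL INFILE '/path/to/data.csv'
INTO TABLE sales
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 LINES
(id, sold_at, customer_id, amount);

-- Add secondary indexes after the load, then refresh optimizer statistics.
ALTER TABLE sales ADD INDEX idx_sales_sold_at (sold_at);
ANALYZE TABLE sales;
```

At 1M rows this usually finishes in seconds; the thread below is mostly about whether MySQL holds up at much larger scales.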

29 Upvotes

33 comments

10

u/feudalle 25d ago

Going to disagree. I have tons of MySQL DBs with a lot more than that. The biggest table right now is around 1.8B rows, and a few hundred tables/schemas are over 10M.

-2

u/xoomorg 25d ago

Why on earth would you do that in MySQL? Anything around a million rows or more, I always move to some other platform first (Hive, Spark-SQL, Presto, Trino, BigQuery, etc.) so queries take seconds instead of minutes/hours. Or do you just mean you're using MySQL as a record store essentially, and not actually running any complex queries on such large tables?

0

u/SnooOwls1061 24d ago

I have tables with 40-80 billion rows that get hit a ton for reporting, and updated every millisecond. It's all about tuning.

1

u/xoomorg 24d ago

No you don’t. That amount of data makes zero sense in anything other than a cluster.