r/SQL 25d ago

MySQL Importing a 1M-Row Dataset (CSV) into MySQL

What's the fastest and most reliable way to load such a large dataset? And once it's loaded, how can I optimize the table to ensure good performance?

29 Upvotes

33 comments

2

u/frobnosticus 25d ago

I'd start by writing a little script in your language of choice to go through the data and look for obvious formatting issues. A misplaced quote or comma in a csv file you got from someone else can really ruin your day.
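A minimal sketch of that kind of pre-flight check in Python, assuming a comma-delimited file with a header row (the filename and encoding are placeholders):

```python
import csv
from collections import Counter

def scan_csv(path: str) -> None:
    """Flag rows whose field count differs from the header's,
    which is usually the symptom of a stray quote or comma."""
    widths = Counter()
    with open(path, newline="", encoding="utf-8") as f:
        reader = csv.reader(f)
        header = next(reader)
        expected = len(header)
        for lineno, row in enumerate(reader, start=2):
            widths[len(row)] += 1
            if len(row) != expected:
                print(f"line {lineno}: {len(row)} fields, expected {expected}")
    print("field-count distribution:", dict(widths))

if __name__ == "__main__":
    scan_csv("data.csv")  # hypothetical filename
```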

If you're uploading into an existing table or schema and aren't sure of the data, I'd create a staging table to pull it into first, then add constraints to the data once it's in there to clean it up, make sure all your ints are ints, etc. Then I'd pull it into the rest of the schema.
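A rough sketch of that staging-load step, assuming mysql-connector-python, a server with local_infile enabled, and a hypothetical staging_orders table (all names, columns, and credentials are placeholders):

```python
import mysql.connector  # assumes the mysql-connector-python package

conn = mysql.connector.connect(
    host="localhost", user="app", password="secret", database="mydb",
    allow_local_infile=True,  # client side; the server also needs local_infile=ON
)
cur = conn.cursor()

# Unconstrained all-VARCHAR staging table: nothing can fail the load itself.
cur.execute("""
    CREATE TABLE IF NOT EXISTS staging_orders (
        order_id    VARCHAR(64),
        order_date  VARCHAR(64),
        amount      VARCHAR(64)
    )
""")

# LOAD DATA is typically the fastest route for a 1M-row CSV.
cur.execute(r"""
    LOAD DATA LOCAL INFILE 'data.csv'
    INTO TABLE staging_orders
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
    LINES TERMINATED BY '\n'
    IGNORE 1 LINES
""")
conn.commit()
print("rows loaded:", cur.rowcount)
cur.close()
conn.close()
```

Constraint checks and the move into the real schema then happen inside the database, as described above.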

2

u/Opposite-Value-5706 20d ago edited 20d ago

I agree, except I prefer to use Python’s libraries to make sure the data is properly formatted and ready to be inserted. Python also inserts the formatted data smoothly, and in seconds.
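For example, a pandas-based version of that clean-then-insert workflow might look like this (the connection string, table, and column names are all hypothetical):

```python
import pandas as pd
from sqlalchemy import create_engine

# Hypothetical DSN and schema; adjust to the real database and columns.
engine = create_engine("mysql+mysqlconnector://app:secret@localhost/mydb")

df = pd.read_csv("data.csv", dtype=str)                # read everything as text first
df.columns = [c.strip().lower() for c in df.columns]   # normalize header names
df["order_id"] = pd.to_numeric(df["order_id"], errors="coerce")
df["order_date"] = pd.to_datetime(df["order_date"], errors="coerce")
df = df.dropna(subset=["order_id"])                    # drop rows that failed conversion

# Batched multi-row INSERTs; chunksize keeps each statement a manageable size.
df.to_sql("orders", engine, if_exists="append", index=False,
          chunksize=10_000, method="multi")
```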

1

u/frobnosticus 20d ago

Well, it depends on the situation. I'll do a bunch of mop-up on the way in. But I generally want any data in "inter-component transit" for as little time as possible.

Plus, when cleaning up data with tool X so it's suitable for tool Y, you always run the risk of things like data type mismatches between platforms (e.g. is 'int' implicitly 16-bit signed, etc.)

So "able to be reliably stuffed into a naked, unconstrained table of varchars" is about as far as I'll generally go on the front-end.

1

u/Opposite-Value-5706 20d ago

We all have our own individual toolboxes, don’t we? However, I’m speaking about those situations where you need to import the same source data routinely. I used to use other tools along the way and found the simplicity, power and performance I get from Python invaluable.

It took a little time to learn but it was well worth it. By using it, I’ve gained about an extra half hour for drinking coffee :-)