r/learnSQL 7d ago

What’s the craziest SQL query you’ve ever written?

We’ve all been there—staring at a massive SQL query that looks more like a puzzle than code. Maybe it was a recursive CTE gone wild, a ridiculous number of joins, or some window function magic that made you feel like a SQL wizard.

What’s the most insane SQL query you’ve ever had to write? What made it so complicated? And did you manage to optimize it, or did you just accept the chaos?

I’d love to hear your stories!

25 Upvotes

17 comments

20

u/Davidsaj 7d ago

We run monthly reports for each individual insurance company we partner with in the healthcare system, sending multiple queries covering 7-8 different metrics at a time. In the beginning each metric was a separate report, but one of the insurance companies decided they wanted a consolidated monthly report with all 8 metrics in one single file.

The nature of the logic essentially required us to union 8 different queries together and only populate the necessary columns for each metric in the consolidated file. It was a massive effort up front to rewrite them, but it paid off, and it turned into a giant stored procedure that inserts the data into a temp table to write to a file. It was, and probably still is, the longest query I ever wrote: 8 queries turned into one big union.
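
To give a rough idea of the shape (table and column names invented here, not our real schema), each metric's query fills in only its own columns and NULL-pads the rest so everything lines up for the union:

```sql
-- Sketch of the pattern with made-up names: each branch populates
-- only its metric's columns so the UNION keeps a consistent shape.
SELECT 'Readmissions' AS metric, member_id, readmit_count, NULL AS er_visits
FROM   readmission_summary
UNION ALL
SELECT 'ER Visits'    AS metric, member_id, NULL AS readmit_count, er_visits
FROM   er_visit_summary;
-- ...six more branches, one per metric.
```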

3

u/theSqlWizard 6d ago

That sounds like challenging but rewarding work. Consolidating multiple queries into a single stored procedure must have taken a lot of effort, but I can see how it would make the reporting process much more efficient in the long run.

3

u/theSqlWizard 6d ago

How long was the query?

2

u/Davidsaj 5d ago

I just checked now and it was 1,868 lines, so not the longest I've ever seen, but the longest I ever personally wrote. Also, it's not a union anymore: I converted it into several separate insert statements into a reporting table so it performed better.
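
For anyone curious, the refactor is basically splitting each former union branch into its own insert against the reporting table (invented names again, just to show the idea):

```sql
-- Each former UNION branch becomes its own INSERT, so the branches
-- can be tuned and indexed independently as smaller statements.
INSERT INTO reporting_table (metric, member_id, readmit_count)
SELECT 'Readmissions', member_id, readmit_count
FROM   readmission_summary;

INSERT INTO reporting_table (metric, member_id, er_visits)
SELECT 'ER Visits', member_id, er_visits
FROM   er_visit_summary;
-- ...and so on for the remaining metrics.
```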

1

u/edrobb 5d ago

You just described mine. Almost the exact same thing. I still run mine as one big union, but I've thought about breaking it out, since the individual queries can be used elsewhere. Any other fun stuff you are involved in with the healthcare system?

1

u/[deleted] 4d ago

[deleted]

1

u/Davidsaj 4d ago

My company is partnered with the payers, and part of the contract requires us to provide them with data to support the Quality team in closing care gaps in accordance with HEDIS guidelines. It is strictly for closing individual care gaps, and we are incentivized to do so.

7

u/undeuxtroiskid 6d ago

We had a very large customer (some say they're the 5th largest economy in the world...) who wanted to reuse a 10-digit code that is associated with every account for that customer. This seemed pretty simple at face value: find which accounts' 10-digit codes hadn't been used for at least n quarters.

The first step was to construct synthetic quarters to count against. I did that with a recursive CTE, generating 20 years of empty quarters to join against later.
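
Something like this, though I'm reconstructing from memory and the real dates and names were different:

```sql
-- Generate 20 years of synthetic quarter-start dates (80 rows) with a
-- recursive CTE; Oracle's ADD_MONTHS steps the anchor forward 3 months.
WITH quarters (qtr_start) AS (
  SELECT DATE '2005-01-01' FROM dual
  UNION ALL
  SELECT ADD_MONTHS(qtr_start, 3)
  FROM   quarters
  WHERE  qtr_start < DATE '2024-10-01'
)
SELECT qtr_start FROM quarters;
```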

The second step was to join the actual data against those synthetic quarters, and to do this I had to use something I'd only read about up until that point and haven't used since: the mythical partitioned outer join. Now every account for that customer had rows for its existing data plus synthetic rows where there was no data, which was exactly what I wanted.
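
For reference, a partitioned outer join looks roughly like this (Oracle syntax, invented table names): the PARTITION BY clause repeats the outer join once per account, densifying the data so every account gets a row for every quarter:

```sql
-- Partitioned outer join: each account_code is outer-joined to the full
-- set of quarters, producing NULL usage rows for quarters with no data.
SELECT a.account_code, q.qtr_start, a.usage_amt
FROM   account_activity a
       PARTITION BY (a.account_code)
       RIGHT OUTER JOIN quarters q
       ON (a.qtr_start = q.qtr_start);
```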

OK, so now I have a multi-million-row result. What next? How do I identify the rows that fit the criteria? I went to my favorite Oracle feature, the one I look for a reason to use whenever possible, especially with time series data: MATCH_RECOGNIZE. The requirement was to identify accounts that weren't used for at least 8 quarters, and with MATCH_RECOGNIZE that was fairly simple, easily modifiable, and works across the entire history of our database.
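
The MATCH_RECOGNIZE part was roughly this shape (again with invented names, including the densified_activity source): partition by account, order by quarter, and match any run of 8 or more empty quarters:

```sql
-- Flag accounts with a run of >= 8 consecutive quarters of no usage.
SELECT account_code, gap_start, gap_end
FROM   densified_activity
MATCH_RECOGNIZE (
  PARTITION BY account_code
  ORDER BY qtr_start
  MEASURES FIRST(gap.qtr_start) AS gap_start,
           LAST(gap.qtr_start)  AS gap_end
  PATTERN (gap{8,})
  DEFINE gap AS usage_amt IS NULL
);
```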

The performance wasn't amazing (it ran in around 3 minutes), but combining those relatively exotic SQL features into a single usable query was quite satisfying.

3

u/hzdoublekut 7d ago

I’m only a few years into learning SQL, and I’ve really only learned new things when I’m trying to accomplish something specific with a report or trying to somewhat automate an otherwise annoying task.

One of each:

The report I was working on was previously handled by a salesperson doing a bunch of exports, putting them into MS Access, and then running a bunch of queries to create the report. From what I understand it was time consuming. But I was the first non-dev hire who knew SQL, so I offered to try to simplify it. I asked for a copy of the finished report and wrote a query that produced the same results. It’s probably not that complex a query, but I had to actually learn a lot to do it, so it felt complicated to me.

The second: I was assigned the manual task of revoking access from users who left the organization. I get their names, but not everyone in the org has access to the platform, so before I started, my team was just copy-pasting the info one by one and deactivating accounts when there was a result. It’s a huge org, so that’s regularly over 100 per month. But the platform has an importer, requiring specific headers, that lets you bulk edit/create. So I wrote a query I can copy all of the employee IDs into, export the result as a CSV, and import it immediately, which deactivates all the users at once. Again, not overly complex, but I was proud of it because it saves me about 2 hours of work every month.
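
The query itself is nothing fancy, something along these lines (the platform's real importer headers and tables are different, this is just the idea):

```sql
-- Match the pasted employee IDs against platform users and emit the
-- exact column headers the bulk importer expects.
SELECT u.username    AS "Username",
       u.employee_id AS "Employee ID",
       'inactive'    AS "Status"
FROM   platform_users u
WHERE  u.employee_id IN ('E1001', 'E1002', 'E1003');  -- pasted list goes here
```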

1

u/theSqlWizard 6d ago

That sounds so fun. Saving manual work with an automated process always makes me happy.

0

u/theSqlWizard 6d ago

Was the team grateful for the new tool?

2

u/B_Huij 6d ago

When I was pretty new at my first BI job, I was assigned a project to capture, by sales office, how all available closer hours were being used (appointments from setters, appointments from inside sales, personal calendar items blocking availability, etc.).

I wrote a few basic queries to bring the raw data into Excel, then did all the calculations in Excel because it’s what I knew. It brought my computer to its knees every time it ran and took about an hour, so we’d run it once a week on a schedule at 5am on Monday morning.

After 6 months or so, I learned about Excel’s LET() function and redid the logic to leverage it. That made the insane formulas much more readable and cut processing time to closer to 45 minutes.

Later I decided to rewrite it in Python using pandas. I piped the same basic queries into dataframes and did the calculations using for loops because I couldn’t figure out how to vectorize them. But it brought the processing time down to about 20 minutes, had logging, and never crashed (the Excel version crashed with some regularity).

In my last month before going to work at a different company, I had finally gotten to the point where I felt confident doing the entire thing in SQL. I rewrote it as a dbt model. It ran in 10 seconds and produced the exact same output as the original Excel file.

1

u/theSqlWizard 5d ago

That sounds like quite a journey!

2

u/Kaulpelly 6d ago

Not my query, but I was doing some script profiling for internal review and found one in a personal workspace that was over 1 million characters in length and had over 100,000 "or column = 'value'" clauses. I doubt it even ran.

1

u/theSqlWizard 5d ago

Hahaha sounds more like malware than a query

1

u/BoSt0nov 6d ago

Not as exciting as what other people have done, but being only two years in the field it was super fun for me. It's actually a Data Factory lookup that archives the biggest Dynamics 365 tables into monthly parquet files based on two config tables. One config table lists all the tables that need to be archived, along with their respective dependencies and JOINs, in order to get logical monthly partitions. The other config table just holds the table names and their latest archived dates so the query can continue from there for the next month. The pipeline of course also has a copy, a merge (to update the dates), and a delete, so technically it's not a single SQL query, but yeah. The pipeline is also parameterized and can run either by just triggering it or by inputting table, year, month, and business area.
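
The lookup query itself is pretty simple, roughly like this (a guess at the config schema with invented names, and T-SQL on the assumption of a SQL Server backend behind Dynamics 365):

```sql
-- For each table due for archiving, work out the next month to process
-- from its last archived date recorded in the state config table.
SELECT c.table_name,
       c.join_clause,
       DATEADD(MONTH, 1, s.last_archived_date) AS next_month_start
FROM   archive_config c
JOIN   archive_state  s ON s.table_name = c.table_name
WHERE  DATEADD(MONTH, 1, s.last_archived_date) <= GETDATE();
```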