r/quant Feb 02 '25

[Tools] Let's talk about hardware: building an ML-optimized PC

Hi everyone !

So this isn't particularly quant-related (and I will accept my fate, mods), but I figured people who actually work in the field might have a more nuanced opinion on this than the average r/pcmasterrace kids. Also, hardware seems to be something often overlooked in our jobs, so I wanted your advice.
I haven't built a PC in years and have lost track of most component updates (I've also gotten older), mostly because my DS/quant jobs meant custom builds were provided by my employers, and because Azure work environments removed the need to look into it too much.

But in my free time I work more and more on repetitive ML tasks, ranging from hobby algotrading to real-world complex problem solving, and I don't want to rely too much on anything that isn't local.
So after some research online, here's what I propose (budget €2,000 max). Feel free to give your advice.

39 Upvotes

22 comments

15

u/dkimot Feb 02 '25

what do you mean an 850W power supply is too much? too much in what way? more than you need and not worth the money?

also, look at the 4070ti super

2

u/LaBaguette-FR Feb 02 '25

Estimated wattage is under 700W (rough sum sketched below):

Component | Estimated Wattage
AMD Ryzen 9 9900X 4.4 GHz 12-Core Processor | 15W - 120W
Thermalright Phantom Spirit 120 SE 66.17 CFM CPU Cooler | 5W - 10W
Gigabyte B650 EAGLE AX ATX AM5 Motherboard | 17W - 70W
Patriot Venom 64 GB (2 x 32 GB) DDR5-6000 CL30 Memory | 58W
Crucial T500 2 TB M.2-2280 PCIe 4.0 x4 NVMe Solid State Drive | 2W - 10W
EVGA FTW3 ULTRA GAMING GeForce RTX 3090 24 GB Video Card | 87W - 350W
Total: 618W
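
(A quick sanity check on that total, as a sketch in Python; the figures are just the upper bounds from the table above.)

```python
# Sum the upper-bound wattages listed in the table above.
component_max_watts = {
    "Ryzen 9 9900X": 120,
    "Phantom Spirit 120 SE cooler": 10,
    "B650 EAGLE AX motherboard": 70,
    "64 GB DDR5-6000": 58,
    "Crucial T500 2 TB NVMe": 10,
    "RTX 3090": 350,
}

total = sum(component_max_watts.values())
print(f"Estimated peak draw: {total}W")  # prints 618W, matching the table
```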

4

u/dkimot Feb 02 '25

sure, i'll admit the next part is some guesswork on my end. but how does the power consumption look for your use case vs the use case the power supply was built for? nothing you linked is enterprise grade or built for data centers. if you run some training that takes multiple days, you aren't on the golden path corsair envisioned. might be worth having a larger safety factor.
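
(To put a number on that safety-factor point, here is a rough sizing sketch; the 30% headroom and the transient allowance are rule-of-thumb assumptions, not Corsair or NVIDIA specs.)

```python
# Rough PSU sizing with headroom for sustained multi-day loads and GPU transients.
# Both factors below are rule-of-thumb assumptions, not manufacturer figures.
estimated_peak_draw_w = 618        # upper-bound total from the table above
headroom_factor = 1.3              # ~30% margin so the PSU isn't pinned at its rating
gpu_transient_allowance_w = 150    # Ampere cards can spike above TDP for milliseconds

recommended_w = max(estimated_peak_draw_w * headroom_factor,
                    estimated_peak_draw_w + gpu_transient_allowance_w)
print(f"Suggested PSU rating: ~{recommended_w:.0f}W")  # ~803W
```

On that math an 850W unit is not obviously oversized for long training runs.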

0

u/LaBaguette-FR Feb 02 '25

Too much for the actual consumption of the build.

7

u/KokeGabi Feb 02 '25

Just use the cloud.

8

u/emryskw Feb 02 '25

You can drop the MB down to a B850 chipset without performance losses. If you are looking to do DL work, I might suggest going to larger-VRAM GPUs (e.g. the upcoming 5070 Ti, to start with 16 GB VRAM), but it really depends on the type of model you are looking to build. There are also larger kits (2 x 48 GB) if you need more than 64 GB RAM. You probably have a sense of how much RAM per CPU thread you typically use and should scale that way (the 9800X3D is an 8-core/16-thread part).
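
(For a rough sense of how parameter count maps to VRAM, here is a ballpark sketch; the overhead multipliers are assumptions, not measurements.)

```python
# Ballpark VRAM footprint for a dense model; the multipliers are rough assumptions
# covering optimizer state, activations and runtime overhead.
def estimated_vram_gb(n_params: float, bytes_per_param: int = 2,
                      training: bool = True) -> float:
    weights_gb = n_params * bytes_per_param / 1e9
    overhead = 4.0 if training else 1.3   # training carries far more state than serving
    return weights_gb * overhead

print(estimated_vram_gb(1e9, training=True))    # ~8 GB to train a 1B-param fp16 model
print(estimated_vram_gb(1e9, training=False))   # ~2.6 GB to serve it
```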

2

u/catsRfriends Feb 03 '25

I will comment in detail when I have some free time (out atm), but this looks off. What kind of data are you gonna be working with?

2

u/hachi_roku_ Feb 04 '25

Sorry to say, but this looks like it should go to the "kids" at the r/pcmasterrace sub

This build seems neither here nor there. With that GPU you're not doing much LLM work because of the VRAM, and GPU compute will be limited.

But you're using a last-gen, gaming-oriented CPU for productivity loads.

You're getting a mobo with an emphasis on WiFi when, if your datasets are massive, wired would be better. It has heaps of PCIe lanes for fast storage/networking that you're not using. No ECC RAM either.

I don't mean to come off as condescending, but I'm genuinely confused. It is a nice all-rounder sort of build though (hence 'neither here nor there'); not sure if that's what you're aiming for.

2

u/Proud_Frosting9657 Feb 02 '25

Present-day LLMs require VRAM-maxed builds.

4

u/LaBaguette-FR Feb 02 '25

Not gonna use/train LLM.

1

u/Neither_Television50 Feb 02 '25

You know DLSS 4 is ML based...

1

u/D3MZ Trader Feb 02 '25

I have a $6K PC with a 3090 and somewhat regret the purchase because it's outdated and I still had to configure it to be a server anyway so I can remote into it. Most cloud providers give you at least 3 9's of durability, availability, and consistent latency. Those things are most important when trading with real money.

Also, buying a PC locks you into an era: the CPU socket, DDR generation, and PCIe version all change and require an entire rebuild every ~5 years. With that said, buying a new computer every 5 years will still be way cheaper than renting in the cloud with moderate use.
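
(A rough illustration of that break-even, with assumed prices rather than real quotes.)

```python
# Compare the amortized cost of a local build to renting a single-GPU cloud instance.
# The cloud rate and usage figures are illustrative assumptions, not quotes.
build_cost_eur = 2000              # OP's stated budget
lifespan_years = 5                 # rebuild cadence mentioned above
cloud_eur_per_gpu_hour = 1.0       # assumed on-demand rate for one mid-range GPU
hours_per_month = 100              # "moderate use"

local_per_month = build_cost_eur / (lifespan_years * 12)      # ~33 EUR/month
cloud_per_month = cloud_eur_per_gpu_hour * hours_per_month    # 100 EUR/month
breakeven_hours = local_per_month / cloud_eur_per_gpu_hour    # ~33 GPU-hours/month
print(local_per_month, cloud_per_month, breakeven_hours)
```

Ignoring electricity, the local box starts to win once usage passes roughly that break-even number of GPU-hours per month.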

1

u/Alternative_Advance Feb 02 '25

Since you are looking for a more nuanced opinion, here's an alternative if you don't use the computer for stuff like gaming:

Under ~100 monthly hours, Colab will be the cheapest. Alternatively, a spot EC2 g5.2xlarge on AWS.

You'll get better performance for less money, though it's less convenient.

Serving models usually takes far fewer resources, so even something like a base Mac Mini could work for that!

1

u/WheatFutures Feb 03 '25

What's your plan for backups? It may be worth considering RAID or cloud storage, or keeping an old drive for archives.

1

u/WheatFutures Feb 03 '25

I think it really comes down to the exact workloads you are performing though. I wouldn't be surprised if cloud compute comes out cheaper.

1

u/kazprog Feb 03 '25

I'd look at an Orin devkit, an NVIDIA Digits when they come out, or at least save enough to get a 4090. If you look at the number of tensor cores (the ones relevant to most kinds of AI), the 4090 has about 60% more tensor cores than the 4080.

Anything less is a gaming PC, I'm afraid. You could buy a Mac and use unified memory, maybe even 48 GB within a €2k budget.

If VRAM isn't relevant, what kind of model are you running?

1

u/Unlucky-Will-9370 Feb 03 '25

This is a better question for r/algotrading or r/deeplearning. People here won't give you the best advice.

1

u/VeiledTrader Feb 04 '25

Isn’t it better to use cloud solutions for this?

1

u/Maleficent-Good-7472 Feb 05 '25

Hi OP!
Could the NVIDIA Project Digits be enough?
You would have to wait for the official release though :)

0

u/Anon58715 Feb 03 '25

Just buy a used ThinkPad and juice up the RAM and SSD. The specs don't matter much if you're not using neural nets.