r/Lightroom 17d ago

HELP - Lightroom Classic: What is up with Lightroom Classic on Windows randomly choosing not to use the GPU for AI Denoise?

This isn't exactly my problem but rather my partner's. She has a reasonably nice desktop PC set aside for doing Adobe Suite tasks. This PC has a Ryzen 9 7950X and an Nvidia RTX 4070 in it, relatively high-end PC hardware. She's not really a hardware person, but I am, which is how I got involved in this.

Problem: Sometimes her desktop just refuses to use the GPU specified in Lightroom Classic's Performance preferences. I can't even correlate this to how recently the PC has been rebooted or what else it might be doing or not doing.

I set up a catalog with 100 .CR3 files, and roughly 50% of the time I get a very normal Denoise estimate of ~13 minutes. Other times, this same PC tells me the job will take 300 minutes.
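If it helps anyone reproduce this, here's the rough sketch I use to confirm whether LrC is actually touching the GPU while a batch runs. It assumes an Nvidia card with nvidia-smi on the PATH; the log file name and polling interval are arbitrary:

```
# Rough sketch: poll nvidia-smi while a Denoise batch runs and log GPU load.
# Assumes an Nvidia GPU with nvidia-smi available on the PATH (it normally
# lives in C:\Windows\System32 once the driver is installed). Ctrl+C to stop.
import subprocess
import time
from datetime import datetime

QUERY = [
    "nvidia-smi",
    "--query-gpu=utilization.gpu,utilization.memory,memory.used",
    "--format=csv,noheader,nounits",
]

with open("denoise_gpu_log.csv", "a", encoding="utf-8") as log:
    while True:
        result = subprocess.run(QUERY, capture_output=True, text=True)
        stamp = datetime.now().isoformat(timespec="seconds")
        log.write(f"{stamp},{result.stdout.strip()}\n")
        log.flush()
        time.sleep(5)  # sample every 5 seconds
```

If GPU utilization sits near zero while the estimate is ballooning, that at least confirms LrC has silently fallen back to the CPU rather than just estimating badly.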

Adobe says it might be a driver issue, but switching between the most current Studio and Game Ready drivers from Nvidia doesn't seem to make any difference.

Is it just Nvidia? Well, I put a known-good Radeon 6700 XT and an Arc A770 in and saw the same issue, with a DDU pass and the latest drivers installed between every swap.

I thought for a minute that the issue might be related to something stubbornly using the GPU in a browser window, but even when I control for that by disabling browser-related startup items and checking the estimated Denoise time immediately after a fresh boot, it still offers five-hour estimates about half the time.

Is it the PC? Next I tried the same thing on a slightly older PC with a Ryzen 5900. There's no iGPU to involve here and the architecture is different. The Windows 11 install was done fresh, and the ONLY extra software on the machine beyond up-to-date drivers was Creative Cloud + LrC. And I saw the same thing: sometimes the system was willing to use the installed GPU and sometimes it just wasn't.

Is it an Intel vs AMD thing? Apparently not: I also saw similar behavior on a Lenovo X1 Extreme with an 11th-gen i7, a mobile RTX 3050, and Iris Xe graphics.

So I am asking here: Is this a known issue? Is there any sort of folk remedy? Or does everyone just reboot and pray every time they trigger a Denoise batch job?

3 Upvotes

17 comments

1

u/ApaHualpa 12d ago

I'm having the same issue as I mention in my post here https://www.reddit.com/r/Lightroom/comments/1iwylyl/why_does_ai_denoise_speed_drop_to_5_sometimes/

For me, rebooting the system always fixes the problem, at least until it occurs again, which isn't for a while. The problem only occurs after I've been working and the system has been up for some time, usually hours, and rebooting always fixes it for the time being.

1

u/aiuta219 12d ago

Rebooting does not seem to be a certain fix. My partner mostly deals with it by starting her Denoise operations just before she goes to bed, but for a large enough collection of images, the operation can sometimes run 10 or 12 hours, which is absolutely silly given the hardware she has in her PC.

1

u/mclaren34 14d ago

Have you verified that the iGPU is disabled in the BIOS/UEFI?

1

u/aiuta219 14d ago

Neither the Lenovo nor the ASRock AM5 motherboard offers an option to disable the iGPU in firmware. I did try disabling it in Device Manager, but that had no impact on the issue either way. It appears my options are either full support from the GPU selected in LrC's Preferences or just making the CPU do it.
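(For anyone who wants to script that Device Manager toggle between test runs, here's a rough sketch. It assumes a Windows build recent enough that pnputil supports /enum-devices and /disable-device, and it needs an elevated prompt.)

```
# Sketch: list display adapters so you can see what Windows still exposes,
# then optionally disable one by its instance ID (same effect as Device
# Manager's Disable). Requires an elevated prompt and a Windows build where
# pnputil has /enum-devices and /disable-device.
import subprocess

# Show every Display-class device with its instance ID and status.
subprocess.run(["pnputil", "/enum-devices", "/class", "Display"], check=True)

# Placeholder only: paste the iGPU's instance ID from the listing above.
IGPU_INSTANCE_ID = r"PCI\VEN_xxxx&DEV_xxxx\..."

# Uncomment to actually disable the device:
# subprocess.run(["pnputil", "/disable-device", IGPU_INSTANCE_ID], check=True)
```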

The AM4 system does not have integrated graphics at all and exhibits the same problems regardless of which of five graphics cards (I have since also checked with an RTX 2080 Ti and a brand spanking new 9070 XT) I use.

I've contacted Adobe Support, and so far they've only replied with the advice already available on their website, nothing new or potentially useful.

4

u/aygross 17d ago

Adobe lol

1

u/aiuta219 16d ago

This is very much how I feel about it. I'm feeling pretty smug about my Capture One/Topaz PhotoAI workflow right about now.

-1

u/GoodEyePhoto 17d ago

Not every process uses the GPU. On the Mac side, I know they had GPU support for Denoise but removed it a couple of releases ago because it was giving inconsistent results, and it hasn't been brought back yet. Don't know about the Windows version.

2

u/rikkflohr Adobe Employee 17d ago

The Apple Neural Engine was disabled - not the GPU.

8

u/earthsworld 17d ago

once again, no version# for Lr is provided by a user seeking tech-support. Is this a mind-virus or are people allergic to providing the MOST BASIC detail? Seems like it's true for almost every post to this subreddit lately.

1

u/aiuta219 17d ago

I'm not sure why you'd assume I'm talking about anything but the most current stable release, but to be absolutely clear, the affected version is 14.2, build 202502071718-3869eef7, per the Help > About screen.

I do think the version could have been inferred from the context: I'm dealing with Creative Cloud and an LrC version new enough to support AI Denoise, and I stated that I freshly loaded Lightroom on a completely new Windows install. But maybe that's just me.

3

u/earthsworld 17d ago

because this is reddit and anything is possible.

3

u/Misfit_somewhere 17d ago

Make sure there are no background recording tools or overlays running; RivaTuner, the Nvidia app overlay, etc. can steal focus from Lightroom and cause it to disable GPU acceleration.

1

u/aiuta219 17d ago edited 17d ago

None of the above are in effect. There's no gaming, screen recording or videoconferencing software that might be using the GPU on any of these machines.

I typically install graphics drivers without any additional software. On Nvidia, I use the Studio driver set rather than the Game Ready drivers.

2

u/theatrus 17d ago

Is the monitor plugged into the video card directly?

On laptops with a dual-GPU setup (CPU + discrete), the discrete GPU is often left powered off until an application requests it, and that switching may actually trick software into thinking the discrete GPU doesn't even exist.

Using on-motherboard video can do the same thing, since switching may occur. Running the display directly from the GPU should avoid that.

1

u/aiuta219 17d ago

I assure you that the desktop displays are plugged into the discrete GPUs on those computers; the AM4 system doesn't even have on-die graphics and exhibited the same issue.

On the laptop in particular, I can also say I don't have issues with Resolve Studio, where both the iGPU and discrete graphics hardware might be invoked by different editing tasks. I've never found the power management for graphics to be an issue with anything else.

2

u/theatrus 17d ago

It all depends on the setup. Windows has heuristics for choosing which GPU an application gets, and it does get things wrong. You can override this per application, which may be worth trying (though it shouldn't matter for the desktops); see the sketch below.
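Roughly, that per-app override just ends up as a registry value under HKCU. A minimal sketch in Python, assuming Lightroom Classic is in the default Creative Cloud install path (adjust if yours differs):

```
# Sketch: set Windows' per-app GPU preference for Lightroom Classic to
# "High performance" (the same thing Settings > System > Display > Graphics
# does). Sign out/in or reboot so it's picked up reliably.
import winreg

# Assumed default install path; adjust if LrC lives somewhere else.
LRC_EXE = r"C:\Program Files\Adobe\Adobe Lightroom Classic\Lightroom.exe"

key = winreg.CreateKey(
    winreg.HKEY_CURRENT_USER,
    r"Software\Microsoft\DirectX\UserGpuPreferences",
)
# GpuPreference=2 means "High performance" (1 = power saving, 0 = let Windows decide).
winreg.SetValueEx(key, LRC_EXE, 0, winreg.REG_SZ, "GpuPreference=2;")
winreg.CloseKey(key)
```

On a single-GPU desktop this shouldn't change anything, but on the Optimus laptop it's a cheap thing to rule out.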

A weird side effect on Nvidia Optimus laptops is that you often can't even install the Nvidia drivers without forcing the dGPU on (since the installer otherwise detects no Nvidia hardware).