
Lightroom's AI denoise moves the hardware goalposts

Stephen Knight

Updated: Dec 16, 2024

Until recently, Adobe Lightroom Classic leaned more heavily on a computer's CPU and RAM than on its graphics processing unit (GPU). However, the AI denoise functionality introduced in 2023 has significantly moved the goalposts: AI denoise relies heavily on powerful "AI-enabled" GPUs. So which graphics cards work well, and which don't? Only limited benchmarks are available, so this article is based on a review of published benchmarks and rather a lot of forum posts.


Benchmark references include (but are not limited to):


It would be useful if GPU comparison benchmarks (such as this mobile GPU test from Notebook Check) could run Lightroom Classic-specific tests (including AI denoise), instead of just a plethora of 3D gaming tests. We can all wish!


ISO 1600 light painting which used AI denoise

ISO 8000 portrait which used AI denoise. Model @tay.tay.x_

Integrated GPUs


Integrated GPUs (iGPUs) generally do not perform well with Lightroom's AI denoise, with typical denoise times in excess of 5 minutes per RAW file. The main exception is the Apple M series "system on a chip", where performance ranges from OK (approx. 1-3 minutes) to excellent (<30 seconds). The more recent (M2 and M3) and higher-specified models (Pro, Max and Ultra) perform best. The M4 series is arriving on MacBook Pros in late 2024.


For Windows PCs and laptops, I have yet to see any evidence of iGPUs that can consistently denoise a typical RAW file in less than 1 minute (update: a reader has recorded sub-30-second denoise times with an AMD Radeon 780M). The newer iGPU (Radeon 890M) in the AMD Ryzen AI 9 HX 370 processors and the Intel iGPU (Arc 140V) in the Core Ultra 200V "Lunar Lake" processors are significant improvements over previous generations of iGPUs. As there are currently no benchmarks available, I would be extremely cautious about relying on these iGPUs for AI denoise. Laptops with these Copilot+ processors are starting to become available with an additional dedicated GPU (such as the Nvidia RTX 4060), which may be a good option if you want decent GPU power alongside Copilot+ PC AI neural processing units (NPUs).


Lightroom Classic currently only runs on ARM-based Qualcomm Snapdragon X processors via emulation, so I would give them a miss until an ARM-native version of Lightroom Classic is available. Even then, the Snapdragon X's iGPU performance, whilst good for an iGPU, is likely to be mediocre for AI denoise.


2025 may be promising for iGPUs running Lightroom Classic's AI denoise, with multiple game-changing "system on a chip" products expected, giving Windows laptops Apple M series-like graphics performance. The AMD Strix Halo APU (likely to be named Ryzen AI Max 385/390/395) will have a much beefier iGPU (and price tag), with possibly better performance than an Nvidia RTX 4060 dGPU. MediaTek is known to be collaborating with Nvidia to create an "AI PC" system on a chip combining MediaTek ARM CPUs and Nvidia iGPUs, expected in the second half of 2025. There may also be a rumoured Intel "Arrow Lake Halo" product in the works. Let's hope that Adobe optimises Lightroom for ARM processors and also makes use of Copilot+ PC NPUs for additional processing power.




Discrete/dedicated GPUs


Discrete/dedicated GPUs (dGPUs) from the last 5 years provide performance ranging from mediocre (a few minutes) to excellent (<15 seconds). I personally have a 4.5-year-old laptop with an Nvidia GTX 1660 Ti, which takes between 2 and 7 minutes to denoise a RAW file. There are many benchmarks for Nvidia GeForce cards. There is a huge improvement for RTX cards over older GTX cards, with performance generally improving with each RTX generation (20xx, 30xx, 40xx), and with each increment in power and price (xx60, xx70, xx80, xx90). There are no benchmarks for the xx50 series, so I would be cautious about buying these for AI denoise, as non-Lightroom-specific benchmark testing shows only 20-35% better gaming performance than the aforementioned GTX 1660 Ti. If buying new, I would advise an RTX 4060 or better. The RTX 50xx Blackwell series is expected in early 2025 and should improve performance further, possibly with more power efficiency on the lower-end models.


AI denoise maxing out a GTX 1660 Ti GPU


There are fewer benchmarks for AMD Radeon dGPUs. There are a few reports that RX 7000 series GPUs have "AI Accelerators" which work very well with Lightroom's AI denoise. Older RX 6000 series GPUs get mixed reviews for AI denoise. The Radeon RX 8000 series GPUs are expected in late 2024/early 2025.


Intel Arc dGPUs initially had a few bugs which prevented use of AI denoise, but these were resolved after a few months. Reviews of Intel Arc dGPU AI denoise performance are mixed.


For all dGPUs, make sure that you are using the latest drivers, and check that Lightroom is actually using the dGPU under Edit > Preferences > Performance (noting that some PCs have both an iGPU and a dGPU). You can also see which GPU AI denoise is using in Windows Task Manager while a denoise job runs.
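If you want to confirm that the dGPU (rather than the iGPU) is doing the heavy lifting, one option on Nvidia systems is to poll the nvidia-smi utility (which ships with the Nvidia driver) while a denoise job runs. The snippet below is just a minimal sketch of that idea in Python; it assumes nvidia-smi is on your PATH, and AMD or Intel users can simply watch the GPU graphs in Task Manager instead.

```python
# Minimal sketch: poll Nvidia GPU utilisation once per second while Lightroom
# runs AI denoise. Assumes nvidia-smi (installed with the Nvidia driver) is on
# the PATH. Start it before clicking Denoise, and stop it with Ctrl+C.
import subprocess
import time

def gpu_status() -> str:
    result = subprocess.run(
        ["nvidia-smi",
         "--query-gpu=name,utilization.gpu,memory.used",
         "--format=csv,noheader"],
        capture_output=True, text=True, check=True)
    return result.stdout.strip()

if __name__ == "__main__":
    while True:
        print(gpu_status())  # e.g. "NVIDIA GeForce GTX 1660 Ti, 97 %, 5123 MiB"
        time.sleep(1)
```

If the dGPU's utilisation stays near zero during a denoise run, Lightroom is probably using the wrong GPU, and the Preferences setting above is worth re-checking.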

A half-decent CPU and enough RAM (16GB minimum, 32GB+ recommended) are also important for decent Lightroom Classic performance. A 100% sRGB display, and preferably a 100% DCI-P3 display, is also recommended.


Conclusion


If you are likely to be a frequent user of AI denoise in Lightroom, and don't want to go for a long walk whilst waiting for a batch of photos to denoise, then I would advise an AI-enabled dGPU such as the Nvidia RTX 40xx series (preferably an RTX 4060 or better), and possibly the AMD Radeon RX 7000 series. Aside from Apple M series laptops (preferably Pro, Max and Ultra models), I would avoid iGPUs, although the most recent iGPUs may be OK. Things should change for the better in 2025, with more powerful iGPUs expected for Windows devices.


If you have any test results to add for recent GPUs, please use the comments section!



2 comments


Timo S.
24 Oct 2024

I think your judgement on AMD iGPUs is a bit pessimistic or harsh. It all depends on your use case and raw files. I have an AMD Ryzen 8700G with an integrated Radeon 780M. Lightroom AI denoise on my raw photos (up to 26MP) takes roughly half a minute with the iGPU. But it really depends on your cameras or raw files. Here are some numbers for performing AI denoise on sample raw files from my various cameras:


Fuji X-T3 (26MP): 33s


Fuji X-E2s (16MP): 22s


Fuji X100 (12MP): 20s


Nikon D7000 (16MP): 28s


And for reference, a high megapixel camera (I don't have one, so I just used a sample DNG file from the web for the test):


Leica…


Stephen Knight
15 Dec 2024
Replying to Timo S.

That's really useful feedback and certainly better than expected based on other benchmark testing. I'll update the article. I welcome testing by other readers on other iGPUs as well.
