
Lightroom's AI denoise moves the hardware goalposts

Stephen Knight

Updated: 3 days ago

This article was originally published in September 2024 and was updated in March 2025 to reflect new graphics card releases.


Until recently, Adobe Lightroom Classic relied more heavily on a computer's CPU and RAM than on its graphics processing unit (GPU). However, the AI denoise functionality introduced in 2023 has significantly moved the goalposts. AI denoise is extremely reliant on powerful "AI-enabled" GPUs, and likely uses the GPU cores designed for AI workloads (Tensor cores on Nvidia, Matrix cores on AMD). So which graphics cards work well, and which don't? Only limited benchmarks are available, so this article is based on a review of published benchmarks and rather a lot of forum posts.


Benchmark references include (but are not limited to):


It would be useful if GPU comparison benchmarks (such as this mobile GPU test from Notebook Check) included Lightroom Classic-specific tests (including AI denoise), instead of just a plethora of 3D gaming tests. We can all wish!


ISO1600 light painting which used AI denoise

ISO8000 portrait which used AI denoise. Model @tay.tay.x_

Integrated GPUs


Integrated GPUs (iGPUs) generally do not perform well with Lightroom's AI denoise, with typical denoise times in excess of 5 minutes per RAW file. The main exception is the Apple M series "system on a chip", where performance varies from OK (approx. 1-3 minutes) to excellent (<30 secs). Performance has improved with each generation (M2, M3, and M4), with decent performance in the higher-specification models (Pro, Max, and Ultra), which have the highest GPU core counts. The M5 series is likely to arrive in MacBook Pros in late 2025.


For Windows PCs and laptops, I have yet to see much evidence of iGPUs that can consistently denoise a typical RAW file in less than 1 minute (update: a reader has recorded sub-30-second denoise times with an AMD Radeon 780M). The newer Radeon 890M in AMD Ryzen AI 9 HX 370 processors, the Intel Arc 140V in Core Ultra 200V "Lunar Lake" processors, and the Intel Arc 140T in Intel Arrow Lake H/HX processors are significant improvements over previous generations of iGPUs. As there are currently no benchmarks available, I would be cautious relying on these iGPUs for AI denoise, with the Radeon 890M being the best bet for adequate performance.


Lightroom Classic currently runs on ARM-based Qualcomm Snapdragon X Copilot+ processors only via emulation, so I would give them a miss until an ARM-native version of Lightroom Classic is available. Lightroom (non-Classic) can run on ARM CPUs but lacks some of Lightroom Classic's useful features, such as brush presets. The Snapdragon X's iGPU performance, whilst good for an iGPU, is likely to be mediocre for AI denoise.


2025 may be promising for iGPUs running Lightroom Classic's AI denoise, with multiple game-changing "system on a chip" products expected that could give Windows laptops Apple M series-like graphics performance. The AMD Strix Halo APU, officially called Ryzen AI Max+ 385/390/395, will have much beefier (in terms of both power and price) iGPUs, the Radeon RX 8050S and 8060S. These may perform better than an Nvidia RTX 4060 dGPU.


MediaTek is known to be collaborating with Nvidia on an "AI PC system on a chip" combining MediaTek ARM CPUs with an Nvidia iGPU, expected in late 2025. There have been rumours of an Intel "Arrow Lake Halo" product in the works, but those rumours have gone quiet recently. Let's hope that Adobe optimises Lightroom Classic for ARM processors and also makes use of Copilot+ PC NPUs for additional processing power.




Discrete/dedicated GPUs


Discrete/dedicated GPUs (dGPUs) from the last 5 years provide performance varying from mediocre (a few minutes) to excellent (<15 seconds). I personally have a 5-year-old laptop with an Nvidia GTX 1660 Ti, which takes between 2 and 7 minutes to denoise a RAW file; it lacks the Nvidia Tensor cores that may be used for AI denoise.

There are many benchmarks for Nvidia GeForce cards. RTX cards are a huge improvement over the older GTX cards, with performance generally improving with each RTX generation (20xx, 30xx, 40xx) and with each increment in power and price (xx60, xx70, xx80, xx90). There are limited benchmarks for the xx50 series, so I would be slightly cautious about buying these for AI denoise, though they are still likely to perform better than most iGPUs. If buying new, I would advise an RTX 4060 or better. The RTX 50xx Blackwell series is in the process of being released, with possibly more power efficiency in the lower-end models. These all appear to have more Tensor cores than their RTX 40xx predecessors, so I'm expecting a significant improvement in AI denoise performance.


AI denoise maxing out a GTX 1660 Ti GPU

There are fewer benchmarks for AMD Radeon dGPUs. A few reports indicate that RX 7xxx series GPUs have "AI Accelerators" which work very well with Lightroom AI denoise, and the limited benchmarks are generally very good. Older RX 6xxx series GPUs have a mixed bag of reviews and benchmarks with AI denoise. The Radeon RX 9000 series GPUs are expected in early 2025.


Intel Arc dGPUs initially had a few bugs which prevented use of AI denoise, but these were resolved after a few months. Reviews and benchmarks of Intel Arc dGPUs' AI denoise performance are mixed.


For all dGPUs, make sure that you are using the latest drivers, and check that the dGPU is the one being used by Lightroom under Edit > Preferences > Performance (noting that some PCs have both an iGPU and a dGPU). You can also see which GPU AI denoise is using in Windows Task Manager while the denoise runs.

A half-decent CPU and sufficient RAM (16GB minimum, 32GB+ recommended) are also needed for decent Lightroom Classic performance. A 100% sRGB display, and preferably a 100% DCI-P3 display, is also recommended.


Conclusion


If you are likely to be a frequent user of AI denoise in Lightroom, and don't want to go for a long walk whilst waiting for a batch of photos to denoise, then I would advise an AI-enabled dGPU such as the Nvidia RTX series (preferably RTX 4060 or better) or the AMD Radeon RX 7000-9000 series. Aside from Apple M series machines (preferably Pro, Max, and Ultra models), I would generally avoid iGPUs, though more recent iGPUs "may" be acceptable. Things should change for the better in 2025, when more powerful iGPUs for Windows devices (such as AMD's Strix Halo) are expected.


If you have any test results to add for recent GPUs, please use the comments section!


Links


Help support this website by donating to:





3 Comments


Damir Maksan
Feb 17

"I would be extremely cautious relying on these iGPUs for AI denoise"

There is no need to be cautious. I agree with Timo S - good iGPUs, such as recent ones from AMD and Apple are pretty decent at denoising. I have been using an AMD mini-PC (with a Radeon 780M iGPU) to denoise 24MP RAW files in around 35s. I suspect the latest AMD (Ryzen AI 9 HX 370/Radeon 890M), which has 50 AI TOPS instead of 10, will be able to do this much quicker still.


Timo S.
Oct 24, 2024

I think your judgement on AMD iGPUs is a bit pessimistic or harsh. It all depends on your use case and raw files. I have an AMD Ryzen 8700G with an integrated Radeon 780M. Lightroom AI denoise on my raw photos (up to 26MP) takes roughly half a minute with the iGPU. But it really depends on your cameras or raw files. Here are some numbers for performing AI denoise on sample raw files from my various cameras:


Fuji X-T3 (26MP): 33s


Fuji X-E2s (16MP): 22s


Fuji X100 (12MP): 20s


Nikon D7000 (16MP): 28s


And for reference, a high megapixel camera (I don't have one, so I just used a sample DNG file from the web for the test):


Leica…
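[Editor's note: a handy way to compare these figures across different sensor resolutions is seconds per megapixel. A minimal Python sketch using the numbers above (the helper function name is mine, not from any benchmark tool):]

```python
# Normalise the reported Radeon 780M denoise times to seconds per megapixel,
# so cameras with different sensor resolutions can be compared directly.
benchmarks = {
    "Fuji X-T3":   (26, 33),  # (megapixels, seconds)
    "Fuji X-E2s":  (16, 22),
    "Fuji X100":   (12, 20),
    "Nikon D7000": (16, 28),
}

def seconds_per_megapixel(mp: int, secs: int) -> float:
    """Denoise cost normalised by resolution (lower is faster)."""
    return secs / mp

for camera, (mp, secs) in benchmarks.items():
    print(f"{camera}: {seconds_per_megapixel(mp, secs):.2f} s/MP")
```

[This suggests a fixed per-file overhead on top of the per-pixel work, since the smaller-sensor cameras cost more seconds per megapixel.]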


Stephen Knight
Dec 15, 2024

That's really useful feedback and certainly better than expected based on other benchmark testing. I'll update the article. I welcome testing by other readers on other iGPUs as well.


© 2021-2025 Stephen Knight. All Rights Reserved.
