Do graphics cards really help Lightroom or just increase the energy bill?

I'm processing a large event shoot at the moment, 400+ photos, and wondering if a dedicated graphics card would improve my processing time?
I'm using the latest version of LR Classic with an i7 processor, 32GB RAM, and an SSD for the LR installation and where the RAW files are housed, but only the onboard Intel HD graphics.
On individual photos the difference may well be small, but multiplied across 400 or 500 photos it could be worthwhile.

If I was to add a card, would a £50 second-hand 4GB option be enough?

I'm not a gamer so only interested in LR benefits

Many thanks,
Mike.
 
I have no issues with this one

NVIDIA GeForce GTX 1050 Ti
 
I have used a GTX 1050 for a few years and was keen to stick with it as it was a version without a fan; any newer cards all seem to have fans. I have just swapped it for an RTX 3050 and haven't really noticed a significant difference in performance. It's possible I don't have enough processor performance to make use of it, though.

Andrew HATFIELD | Architectural and Interior Photographer
 
Massive, massive, massive difference on a desktop using a 3060 Ti + 32GB vs a 2023 Samsung i7 laptop with integrated junk. Denoise runs in a few seconds vs minutes, so that much. And this is only going to grow from now on.
 
Well, finally got around to fitting an NVIDIA GeForce GTX 1050 Ti 4GB. If I'm honest, I'm not blown away by any noticeable performance improvements. Without scientific testing, the only noticeable improvement I can see is on exports, which now take about 60% of the time they did before the card was installed. As I batch export, this is probably the least important area to me.

I was hoping for much quicker load times for the crop tool and sequencing between images, both of which are still fairly slow. The upright transform seems a bit quicker, but overall it's marginal. What aspect of Lightroom would people expect to benefit from a reasonably OK graphics card?
 
The new PC from PCSpecialist I bought late last year is fitted with a 12GB NVIDIA GeForce RTX 4070, and the difference compared to my old machine is like night and day. The graphics operations in Lightroom which really need GPU memory now take a few seconds rather than minutes and, because of this, get used far more frequently.

Anthony
 
I'm more with @mikeyw. I replaced a six-year-old computer and noticed no improvement in performance. There probably was an improvement; it just wasn't significant enough to notice.
 
I often wonder if we are encouraged to spend money on 'upgrades' for marginal gains. Yes, if you throw a fortune at it you'll get gains, but for a moderate investment the benefits are just that: marginal. I'll only truly know once I edit a large job and see if anything is better.
 
They do help, and no, they don't increase your bills much at all when they're not doing heavy work.

A desktop uses a lot more power by default than a laptop. If that's a concern, perhaps best to go with the highest model of MacBook, but that will put your expenses up considerably upfront, and you will still need a power-hungry monitor.
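To put rough numbers on the energy-bill side of the original question, here is a back-of-envelope sketch in Python. Every figure in it (idle and load wattage, daily usage hours, electricity price) is an assumed illustrative value, not a measurement; swap in the draw figures from your own card's spec sheet and your actual tariff.

```python
# Rough annual running-cost estimate for a dedicated GPU.
# All figures below are assumptions for illustration, not measurements.

IDLE_WATTS = 10          # assumed idle draw for a mid-range card
LOAD_WATTS = 130         # assumed draw during sustained export/denoise work
HOURS_IDLE_PER_DAY = 4   # assumed time the PC is on but the GPU is idle
HOURS_LOAD_PER_DAY = 0.5 # assumed time of heavy Lightroom work per day
PRICE_PER_KWH = 0.30     # assumed electricity price in GBP per kWh

# Energy used per day in kilowatt-hours, then scaled to a year.
daily_kwh = (IDLE_WATTS * HOURS_IDLE_PER_DAY
             + LOAD_WATTS * HOURS_LOAD_PER_DAY) / 1000
annual_cost = daily_kwh * 365 * PRICE_PER_KWH

print(f"{daily_kwh:.3f} kWh/day, ~£{annual_cost:.2f}/year")
```

Under these assumed figures the card adds only a low double-digit number of pounds per year, which supports the point above: unless the GPU is doing heavy work for long stretches, the running cost is small next to the upfront cost.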
 
I have a graphics card built into my laptop. I thought graphics cards were for gaming, where the difference is supposed to be huge compared to no graphics card. I am not a gamer, BUT when I ran the star-gazing app 'Stellarium' on my laptop the difference was immediately apparent. Screen refresh was smooth on the graphics card but glitchy without it, and the CPU temperature was up at 91°C when not on the graphics card, yet everything was around 40°C when on it. So I concluded that the software detects the graphics card (or you might need to tell it the card is there) and everything runs smoother, faster, and cooler, even if you can't actually see it physically.
 