Need a GPU for Lightroom 5.7 photo editing on a 27 inch monitor and a 4 year old rig

Impulsively, I was online all of last night and into the early morning, and in a sleepy mindset I pulled the trigger on a BenQ SW2700PT.

When I awoke this morning I realized that I'd neglected to consider whether my PC can handle it. My expertise is in pixels, not bits and bytes.

I don't have a graphics card - I use the integrated Intel HD 4000 graphics on my Ivy Bridge i7 3770K.
My mobo is the ASUS P8Z77-V PREMIUM
From what I've subsequently read, the new monitor's increased resolution and the resulting extra calculations will tax my CPU. I'm wondering (hoping) whether a dedicated GPU would help.

I do NOT game or edit video. I do only light Netflix viewing; the biggest demand on my system is RAW photo editing. Moving pixels around and displaying them on a high res monitor involves a lot of CPU calculations.
Will a GPU help my PC's performance?
My understanding is that Lightroom 5.7 does not make use of GPU acceleration. The newer LR 6.0 does, and this may force an upgrade for me.
I was hoping for a complete rebuild in 2 or 3 years; I hope the new monitor purchase doesn't make it necessary sooner.
Any recommendations on the best value (bang for buck) GPU that could help me (and ideally be used in my next build)?
I run W10 64 bit with 32 GB RAM, 1 SSD, 2 spinning HDs and an Antec 620 W PSU.
Sincerest thanks for any helpful suggestions!
 
You should be fine with the CPU's integrated graphics for 2D work. That CPU supports up to three displays, so one at 1440p is well within its capability.

Even with the option for GPU acceleration, a lot of people have found Lightroom works better with it disabled.
 
Thanks - I hope so; we'll see soon enough. But I've seen a few threads on other sites where people have seen system slowdowns in LR with a bigger monitor. I suspect it is the CPU being bogged down with moving more pixels around the screen.
We'll see if mine is up to the task.

Oh, and I was hoping to use the old NEC as a second monitor... I won't if it taxes the CPU. As I am planning a new system in 2-3 years' time, maybe I'll overclock the CPU.
Thanks Neil!
 
As said, GPU acceleration in LR is more or less worthless; I have it switched off on all my computers.

As long as the GPU can drive the monitor (which it can), all will be fine.
 
Thanks... would a new dedicated GPU card help things out?
I have no qualms spending a few dollars (I'm Canadian) if it will. I have a ton of photos to process and upload to Getty Images, so it will pay for itself if it improves/quickens my workflow.
 
Note that you'll need to use DisplayPort to get the full resolution from that monitor with your motherboard.
The best value new card at the moment is probably the GTX 1050. It should just drop into your motherboard, and W10 will recognise it and download suitable drivers. I'm currently using one at the same resolution with my i7-3770 under W7 Pro.
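
For anyone wondering why DisplayPort specifically: a back-of-envelope sketch of the pixel clock involved (the ~9% blanking overhead is my approximation, not an exact spec figure):

[CODE]
# Back-of-envelope check: why 2560x1440 @ 60 Hz wants DisplayPort here.
# The 9% blanking overhead is a rough CVT-RB approximation.
width, height, refresh = 2560, 1440, 60

active_rate = width * height * refresh     # visible pixels per second
pixel_clock = active_rate * 1.09           # add rough blanking overhead

print(f"Active pixel rate:  {active_rate / 1e6:.0f} Mpx/s")   # ~221 Mpx/s
print(f"Approx pixel clock: {pixel_clock / 1e6:.0f} MHz")     # ~241 MHz
print("Single-link DVI (165 MHz):", "OK" if pixel_clock <= 165e6 else "too high")
[/CODE]

That ~241 MHz is well beyond a 165 MHz single-link DVI port, and as far as I recall the HDMI output on boards of that generation tops out at 1920 x 1200 anyway.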
 
Thanks... would a new dedicated GPU card help things out?
I have no qualms spending a few dollars (I'm Canadian) if it will. I have a ton of photos to process and upload to Getty Images, so it will pay for itself if it improves/quickens my workflow.


All the GPU does is accelerate the Develop module (moving sliders etc.), and my MacBook with a crappy mobile processor handles that pretty well, so your CPU will crush it with ease. The things that take time, such as exports and preview building, are done by the CPU, so a GPU won't really help at all. I don't use Photoshop much, but I believe that is mainly CPU driven too.

It might be nice to have a graphics card for other reasons (such as connectivity, as mentioned above), but it won't speed up your processing.
 
All the GPU does is accelerate the Develop module (moving sliders etc.), and my MacBook with a crappy mobile processor handles that pretty well, so your CPU will crush it with ease.
I realise the new screen will have no impact on the exporting or rendering of RAW files to JPG; it's the display of the photo in LR when I make changes that concerns me.
Currently it has no problem in the Develop module with my 20 inch monitor, but doesn't that change if I now use a 27 inch higher resolution screen?
My mobo has a DisplayPort with a max resolution of 2560 x 1600 @ 60 Hz and a Thunderbolt port with a max resolution of 2560 x 1600 @ 60 Hz.
The monitor includes a DP to Mini DP cable but NO DP to DP cable. So is Mini DP the same as Thunderbolt? My mobo is one of the few PC boards with a Thunderbolt port, so do I not need to purchase a DP to DP cord?
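
To put some rough numbers on my worry, a quick comparison (assuming my NEC 20WMGX2's native 1680 x 1050 - correct me if that's wrong):

[CODE]
# Rough pixel-count comparison: old 20" NEC vs the new 27" BenQ.
# 1680x1050 is my assumption for the NEC 20WMGX2's native resolution.
old_px = 1680 * 1050     # ~1.76 Mpx
new_px = 2560 * 1440     # ~3.69 Mpx

print(f"Old monitor: {old_px / 1e6:.2f} Mpx")
print(f"New monitor: {new_px / 1e6:.2f} Mpx")
print(f"Ratio: {new_px / old_px:.1f}x more pixels to redraw per edit")
[/CODE]

So roughly double the pixels to push on every slider move, which is where my concern comes from.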
 
I realize the following site is a commercial site devoted to selling you hardware, but their advice is interesting.
https://www.pugetsystems.com/recomm...-Adobe-Lightroom-141/Hardware-Recommendations

Video Card (GPU)

In Lightroom CC 2015 and Lightroom 6, the software is able to utilize the power of your GPU to improve performance when editing images in the Develop module. At the moment, the performance gains are fairly modest, although Adobe has been investing heavily in GPU acceleration. While a high-end GPU is not required to get the benefits of GPU acceleration in Lightroom, it may be a good idea to get a slightly faster GPU than you think you need to help future proof your system.


Lightroom is also very light on VRAM requirements, so even a card with just 2GB of VRAM should be more than enough. However, if you work with large images in Photoshop or use a 4K monitor it is a good idea to use a card that has at least 4GB of VRAM if possible. Workstation video cards are not required for Lightroom, although if you will be using a 30-bit monitor you will need a NVIDIA Quadro video card as GeForce cards currently do not support 30-bit display output.


Although it is likely that Adobe will increase GPU acceleration support in Lightroom in the future, the current demand on the video card is actually relatively light. We recommend either a GeForce GTX 1060 or a GeForce GTX 1070.


Is my BenQ SW2700PT a 14-bit monitor?
 
I think you need a newer Lightroom, or Capture One, to really take advantage of a "proper" graphics card (maybe other RAW developers too).
And for Capture One, AMD cards were better - I think they tend to have better general compute performance than the equivalent Nvidia card. You do need to do some research before you buy, as the lower end models are normally quite a bit worse than the upper half: if the second digit in the name is a 7 or above, it's a decent card; a 3 or 4 is normally garbage.

And with overclocking you can probably get to 4.5 GHz or somewhere around that.
 
Even with my 3.3 GHz i5 5K iMac, which has a reasonably decent Radeon R9 M395 in it, I have GPU acceleration switched off. Maybe one day I'll try it again, but in your situation I really wouldn't worry.

But, as has been said, it might be nice to get a card for connectivity reasons (although I think it will work as it is).
 
I realise the new screen will have no impact on the exporting or rendering of RAW files to JPG; it's the display of the photo in LR when I make changes that concerns me.
Currently it has no problem in the Develop module with my 20 inch monitor, but doesn't that change if I now use a 27 inch higher resolution screen?
My mobo has a DisplayPort with a max resolution of 2560 x 1600 @ 60 Hz and a Thunderbolt port with a max resolution of 2560 x 1600 @ 60 Hz.
The monitor includes a DP to Mini DP cable but NO DP to DP cable. So is Mini DP the same as Thunderbolt? My mobo is one of the few PC boards with a Thunderbolt port, so do I not need to purchase a DP to DP cord?
No, it's not even related. You will need a DP-DP cable.
 
Any thoughts on why the monitor does NOT ship with a DP to DP cable but DOES ship with a Mini DP to DP cable?
 
OK, so I'm sort-of wrong...
I don't know, as every DP-equipped device here uses the full-size connector.
 
Received the monitor today and setting it up was a breeze. Base, arm and screen were easy to assemble.

Had absolutely NO problems running it on my 4 year old processor's HD 4000 graphics - I even have the NEC 20WMGX2 as a second monitor, with both running at their native resolutions.

The BenQ is at 2560x1440 - the W10 drivers were installed, though not as easily as the manual said they would be.

Good news: it is connected using the supplied Mini DP to DP cable: the Mini DP plug into my Intel mobo's Thunderbolt port and the DP plug into the monitor. I never thought I'd use that port, but I'm so glad I didn't have to go out and waste an hour buying another cable. Still don't know why they included it and not a DP to DP cable.

I am happy to say that I see almost NO performance degradation with LR 5.7.

Small difference: scrolling in Grid view while in Library mode is a bit choppy.

Small difference: a bit more time is required to cycle between regular and zoomed views in Loupe view.

Performance differences are negligible - to the point that I regret the considerable time I wasted researching this. Guess my i7 3770K and mobo were good choices 3 years ago... and I'm happy I won't need a GPU!!

Though it is early days, I'll report any issues.
 
Did a system upgrade, and the monitor is playing very well with LR now. When I first used it, LR was slowing down when I used the crop tool - this forces LR to change the zoom to fit the image on screen. So I added a second SSD, as the LR lrdata previews file (> 100 GB) had almost filled my SSD C boot drive, and I also swapped out a dying spinning hard drive for a new one. The problems disappeared completely.

I'm not a techie, but I suspect it had to do with the Windows page file being on the 'soon to be toast' data disk drive, as well as performance being degraded by my boot drive being near capacity. I've moved the LR catalogue and all the lrdata/lrcat files onto a new secondary SSD.
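
For anyone who wants to catch this earlier than I did, a small script along these lines will report how big the previews cache has grown (the path is just an example - point it at your own .lrdata folder):

[CODE]
# Report the size of the Lightroom previews cache before it fills a drive.
# The path below is an example only - substitute your own .lrdata location.
import os

previews = r"D:\Lightroom\My Catalog Previews.lrdata"

total_bytes = 0
for root, _dirs, files in os.walk(previews):
    for name in files:
        total_bytes += os.path.getsize(os.path.join(root, name))

print(f"Previews cache: {total_bytes / 1024**3:.1f} GB")
[/CODE]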

One or both of the actions above has resulted in a performance boost. My i7 3770K and onboard Intel HD 4000 graphics are now running LR on the BenQ with NO perceptible difference compared to the smaller 20 inch monitor I used previously - which is now hooked up as a second monitor. I also calibrated the BenQ with the BenQ software, which adds the profile to the monitor's LUT (just reporting what I read - can't say I understand what that means). Someone suggested that this could also take some of the burden off the CPU.
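
On the LUT point, my rough understanding (happy to be corrected) is that hardware calibration stores the correction curve inside the monitor itself, so the PC can send unmodified values and nothing extra is computed per frame on the graphics side. A toy sketch of what a 1D lookup table does, with a made-up gamma curve:

[CODE]
# Toy 1D LUT: every input level maps to a pre-computed corrected level.
# A hardware-calibrated monitor holds tables like this internally, so the
# graphics output needs no per-pixel correction. Gamma 2.2 is illustrative.
lut = [round(255 * (i / 255) ** (1 / 2.2)) for i in range(256)]

level_in = 128
level_out = lut[level_in]
print(f"Input level {level_in} -> corrected level {level_out}")
[/CODE]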

The only difference I see now is when I scroll quickly between files in Library mode: they take longer to come into sharp focus than they did previously.

I did all these changes at the same time, so I can't say with any certainty which of the three things contributed most to my performance boost.
 
I just delete my preview file when it gets too big.
 
Note that you'll need to use DisplayPort to get the full resolution from that monitor with your motherboard.
The best value new card at the moment is probably the GTX 1050. It should just drop into your motherboard, and W10 will recognise it and download suitable drivers. I'm currently using one at the same resolution with my i7-3770 under W7 Pro.

How does that card stack up against an RX 460?
 
This isn't personal experience, as I haven't used any recent AMD GPUs (Nvidia is much better for Folding), but www.videocardbenchmark.net scores the RX 460 at 4224 and the GTX 1050 at 5046. The RX 460 is cheaper, however - enough that that site gives it fractionally better value. Nvidia cards tend to use less power than AMD when idling, so might be marginally cheaper to run. Just pick your favourite brand and go with it.
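
To put some numbers behind "fractionally better value", a quick sketch (the prices are placeholders - plug in whatever your local shop actually charges):

[CODE]
# Score-per-dollar using the videocardbenchmark.net figures quoted above.
# Prices are placeholders only - substitute current local prices.
cards = {
    "GTX 1050": {"score": 5046, "price": 140.0},
    "RX 460":   {"score": 4224, "price": 115.0},
}

for name, card in cards.items():
    print(f"{name}: {card['score']} pts, {card['score'] / card['price']:.1f} pts/$")

print(f"GTX 1050 leads on raw score by {5046 / 4224 - 1:.0%}")
[/CODE]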
 
Perspex side panel and loads of LEDs >.>

Slam it, dustbin exhaust, and tint the perspex so you can't see through it.

:jaffa:

edit: Don't forget to hand paint racing stripes with green Hammerite - this really picks up the pace. Lads on the underground gaming scene reckon it's the new overclock.
 
[QUOTE="Snapsh0t, post: 7730848, member: 21734" The RX 460 is cheaper, however, enough that that web site gives it fractionally better value. Nvidia cards tend to use less power than AMD when idling so might be marginally cheaper to run. Just pick your favourite brand and go with it.....[/QUOTE]
I went with the cheaper one, as I suspect my use won't tax the GPU at all. I don't game - just LR and web surfing. I liked that it was plug and play and uses less power, and it was easy to install. It has helped my system a bit. I also OC'ed my CPU using the ASUS BIOS 'Optimal' mode, so I cannot tell if the system is faster because of the GPU or because my 3770K is now spinning at 4.1 GHz.
Either way, I am happy.
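
For context on how much the overclock alone could account for, a rough ceiling (3.5 GHz is the 3770K's stock base clock; turbo boost muddies the comparison, so treat this as an upper bound):

[CODE]
# Upper bound on the gain from the BIOS auto-overclock.
# 3.5 GHz is the 3770K's stock base clock; turbo (up to 3.9 GHz) means the
# real-world difference is smaller than this simple ratio suggests.
stock_ghz, oc_ghz = 3.5, 4.1
print(f"Clock increase: ~{oc_ghz / stock_ghz - 1:.0%}")
[/CODE]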

Thanks everyone!!!
 