Graphics card

#1 · Steve
Evening folks. I am looking to upgrade my 6 year old graphics card for photo editing - any recommendations please? (might be able to use PS then :)
Also, a decent monitor for graphics too.
Appreciate your time and thanks.
 
#5 · Snapshot
I would now go for a GTX 1650, as it's more or less the replacement for the GTX 1050 Ti: a lot more performance for the same power consumption and roughly the same price.
Had a quick look at the specs and indeed the 1060 has more, including, it seems, 6GB vs 4GB.

But the one thing to be aware of is the power demand:
GTX 1050: 75W, supplied from the PCIe slot alone.
GTX 1060: 120W, with a supplemental power connector, as slot power alone is not sufficient.

Plus, check the specs of the individual card builders! They don't all implement every feature of the Nvidia reference spec: some leave off the DVI connector, and some may not support the higher screen resolutions in Nvidia's design specifications. AFAIK some of those sorts of 'cut down' implementations reduce the power demand enough to dispense with the supplemental power connector.
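The slot-power point above can be sketched as a quick check. This is a minimal sketch: the 75W figure is the PCIe x16 slot limit as quoted in this thread, and the per-card board-power numbers are the ones mentioned above, not a full product database.

```python
# Rough check: can a card run from the PCIe x16 slot alone?
# A slot supplies at most ~75 W, so anything above that needs a
# supplemental 6/8-pin connector from the PSU.

PCIE_SLOT_LIMIT_W = 75

def needs_supplemental_power(board_power_w: int) -> bool:
    """True if the card's board power exceeds what the slot can supply."""
    return board_power_w > PCIE_SLOT_LIMIT_W

# Board-power figures quoted in the thread
cards = {"GTX 1050 Ti": 75, "GTX 1650": 75, "GTX 1060": 120}

for name, watts in cards.items():
    verdict = ("needs a supplemental connector"
               if needs_supplemental_power(watts)
               else "runs from the slot alone")
    print(f"{name} ({watts} W): {verdict}")
```

Note that, as discussed later in the thread, factory-overclocked variants of a nominally 75W card can exceed the slot limit, so check the specific variant's specs rather than the chip name.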
 
#6 · Bazza
I did the exact same upgrade, from a GTX 560 Ti (I think it was) to a GeForce GTX 1050 Ti. Also note I have a 750 watt PSU.
First of all, why did I do the upgrade? Because playback of my dashcam footage "stuttered" rather than playing smoothly throughout the video. The new graphics card sorted that out. Make sure it is the Ti version.

As for a monitor, I have a Dell IPS monitor which suits me.
 
#7 · Mark
> Quoting #5: "Had a quick look at the specs … the 1060 has more, including, it seems, 6GB vs 4GB …"
Snapshot said GTX 1650, not GTX 1060.
 
#8 · Snapshot
> Quoting #7: "Snapshot said GTX 1650, not GTX 1060."
Drat! My bad :coat: :exit:

PS: though the 1650 has dropped the DisplayPort connection; only(?) HDMI and DVI. FWIW I also bought my 1050 Ti because it has DP, HDMI and, for me at the moment, DVI, giving me the potential to multi-screen if I can justify the additional monitor sooner rather than later, on both space (or lack of it) and the extra spend.
 
#9 · Mark
> Quoting #8: "… the 1650 has dropped the DisplayPort connection, only(?) HDMI and DVI …"
The 1650 or 1050 (Ti) refers only to the graphics chip, which is made by Nvidia, and doesn't dictate what connection ports the card has. It's a bit like an engine in a car, which of course doesn't dictate how many seatbelts the car has.

Each card manufacturer (which can include Nvidia) makes a card from a PCB, graphics chip and VRAM, and then adds whatever ports it wants to suit market demand.
 
#10 · Jonathan
> Quoting #8: "… the 1650 has dropped the DisplayPort connection, only(?) HDMI and DVI …"
I see more 1650s that have dropped DVI than have dropped DP. How big was the sample you checked?
 
#11 · Jonathan
> Quoting #6: "I did the exact same upgrade … playback of my dashcam footage 'stuttered' …"
I found the stuttering was more down to the playback software. Whatever MS uses stuttered, coming to a complete halt near the end of the file even with a GTX 1050 Ti, but VLC had no problems at all.
 
#12 · Brian
If all you want it for is 2D photo editing then avoid those expensive 3D-accelerated graphics cards; they are a waste of money (and power).
2D photo editing is not very demanding on a graphics card, as long as you meet the editing program's graphics recommendations in terms of OpenGL version, recommended memory size, etc.
Whatever card you select, always make sure it is kept updated with the latest driver software.
I find that AMD seem to have been updating their Radeon software almost weekly recently.
 
#13 · Steve (kentbirder, OP)
I don't play games on the PC, so a good card for photo editing would suffice. For PS, I think the minimum is 2 GB.
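That sort of minimum can be sanity-checked in one line per card. A hypothetical sketch: the 2 GB Photoshop figure is the one mentioned in the post above, and the per-card VRAM sizes are illustrative, so verify them against the vendor's current system requirements.

```python
# Compare a card's VRAM against an application's stated minimum.
# The minimum below is the figure quoted in the thread; card VRAM
# sizes are illustrative examples.

def meets_vram_minimum(card_vram_gb: float, app_minimum_gb: float) -> bool:
    """True if the card has at least the application's minimum VRAM."""
    return card_vram_gb >= app_minimum_gb

PHOTOSHOP_MIN_GB = 2  # minimum mentioned in the thread

for card, vram in [("GT 1030", 2), ("GTX 1050 Ti", 4), ("GTX 1650", 4)]:
    ok = "OK" if meets_vram_minimum(vram, PHOTOSHOP_MIN_GB) else "below minimum"
    print(f"{card}: {vram} GB -> {ok}")
```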
 
#14 · Jonathan
> Quoting #12: "If all you want it for is 2D photo editing then avoid those expensive 3D accelerated graphics cards …"
Recommending AMD cards to save power is not exactly helpful as, in general, they draw considerably more than the equivalent Nvidia card.
 
#15 · Stuart
As you don't play games and it's for photo editing only I personally wouldn't entertain anything that requires additional power from your PSU.

A GPU that draws its power from the PCI-E slot will be sufficient.

You may even get away with something like a Nvidia GT 1030.

The only time I would look at anything beefier is if you were using specific filters that can utilise the extra benefits of a GPU. Only then would I look at a GTX 1050 Ti and above.
 
#16 · Brian
> Quoting #14: "Recommending AMD cards to save power is not exactly helpful …"
I can see nowhere in my post where I recommended an AMD video card (even though I use one without problems).
Previous Nvidia cards I have used required an additional power connector from the PSU. How does that stand as far as power consumption goes?
 
#17 · Snapshot
> Quoting #15: "As you don't play games and it's for photo editing only I personally wouldn't entertain anything that requires additional power from your PSU …"
I think @stupar's last paragraph is the most relevant, as in: just how often do "we" update our hardware compared to software? As the software improves (?), it makes greater use of the GPU's processing power and its onboard RAM. So, a bit like a carpenter, think "measure twice and cut once" to avoid expensive mistakes.
 
#18 · Paul
A GTX 970 would be more than enough for photography work. They're cheap as chips, as the gamers tend to offload them, they're not too power hungry, and the good ones come with a decent set of outputs.
 
#20 · Stuart
In light of all the recommendations above there are some important questions to ask:

Firstly, what is your current power supply's wattage, and does it also have provision for a 6/8-pin PCIe power output?

If the answers to the above are 450 watts or less and no, then you will be better off with a GPU that only draws power from the PCIe slot (unless you want to spend money upgrading your PSU too).

It's worth reiterating that not all cards are equal.
Take, for example, the recommendation above of the GTX 970.
The Nvidia version recommends a 500W minimum and 2 x 6-pin connectors.
Switch to the Gigabyte Windforce edition of the GTX 970 and all of a sudden it wants a 550W minimum plus 1 x 6-pin and 1 x 8-pin connectors.

What are the rest of your system specs? I note you say that with a new GPU you might be able to run PS; PS also has the ability to run from onboard graphics.
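The variant-to-variant difference above can be sketched as a simple compatibility check. The wattage and connector figures are the ones quoted in this post; the connector strings are just labels, and the example PSU is hypothetical.

```python
# Check a specific card variant against what your PSU offers.
# Different board-partner versions of the same GPU can ask for a
# higher minimum PSU wattage and different PCIe power connectors.

from collections import Counter

def psu_ok(psu_watts, psu_connectors, min_watts, required_connectors):
    """True if the PSU meets the variant's minimum wattage and
    provides every required PCIe power connector."""
    have, need = Counter(psu_connectors), Counter(required_connectors)
    return psu_watts >= min_watts and all(have[c] >= n for c, n in need.items())

# Requirements quoted in the thread for two GTX 970 variants
variants = [
    ("reference GTX 970", 500, ["6-pin", "6-pin"]),
    ("Gigabyte Windforce GTX 970", 550, ["6-pin", "8-pin"]),
]

my_psu = (450, ["6-pin"])  # hypothetical example PSU

for name, watts, conns in variants:
    verdict = "OK" if psu_ok(*my_psu, watts, conns) else "PSU upgrade needed"
    print(f"{name}: {verdict}")
```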
 
#21
I was recommended to purchase a heat-sink (passively cooled) option rather than one with a fan, purely to reduce noise I think. What are people's opinions on that?
 
#22 · Stuart
> Quoting #21: "I was recommended to purchase a heat-sink option rather than a fan …"
Passively cooled GPUs are fine if they aren't going to be under massive strain.
Equally, GPUs with fans aren't that noisy unless at full chat, so it's six and two threes.

My Gigabyte RX 580 has fan-stop technology, so at idle and under light duties the fans don't spin at all.
They only kick in for medium to heavy tasks, and even then they aren't that noisy.

As a side note, blower-style GPUs are traditionally known to be louder than open-fan GPUs.
 
#23 · Jonathan
> Quoting #16: "I can see nowhere I recommended an AMD video card … Previous Nvidia cards I have used required an additional power connector …"
The GTX 1650 I recommended only draws a maximum of 75W, so it doesn't need an extra power connector unless it's a factory-overclocked version. The same is true for the GTX 1050 Ti.
 