Best computers for photography...

Another firm worth a look is Power Computing in Bedford. I've used them for my last two machines and have also bought machines for work from them; they will build to any spec and, unlike some companies, are very quick on delivery. My current machine I ordered on a Monday afternoon and picked up on the Tuesday.
 
That is no longer the case! CS4 and beyond will use the processing power of the graphics card to speed up floating point calculations.

Got a link to a PR of that?
I doubt it'll be able to use any graphics card. CUDA-capable nVidia GPUs maybe, perhaps some newer ATI cards as well, but I wouldn't bet on it.
 
Looks like this guy's actually working for Adobe:
http://blogs.adobe.com/jnack/2008/05/oct_1.html
I didn't say anything about schedule. In fact, I never said that any of this stuff is promised to go into any particular version of Photoshop. Rather, as with previous installments, it's a technology demonstration of some things we've got cooking--nothing more.

For those who haven't the foggiest about it, GPU programming doesn't work on just any graphics card from any brand. Look at http://www.gpgpu.org/ for more details.
 
As the vast majority of people are using a small subset of graphics cards, it is quite likely that Adobe will write different modules for each one within their software, just as they will have routines optimised for particular processor features. There has been a lot of buzz about this, so I simply mentioned it. If it is in CS4/CS Next then fine; if not, fine. We'll see then! Programmers have already made some general-purpose use of graphics cards, Folding@home for example, but we'll see what comes about.

Oh, and I said the graphics card, not ANY graphics card. If you are a serious CS3 user upgrading to CS4, you will most likely have a recent ATI or NVIDIA card in your system anyway. For those who don't, it no doubt wouldn't work.

After typing this I came across this article, which explains the use of GPUs for general-purpose calculation in a simpler way:

http://www.tomshardware.com/reviews/nvidia-cuda-gpu,1954.html

But as I said earlier, I wasn't referring to the system using ANY GPU. As long as it can make use of modern ATI and NVIDIA cards, most graphics professionals will get a benefit. People would also change their card to get a substantial speed increase.
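
Just to give a flavour of what "using the GPU for floating point work" actually looks like, here's a toy CUDA sketch I've knocked up purely for illustration (nothing to do with how Adobe have actually written anything): every pixel value gets its own thread, which is exactly the sort of job these cards are built for.

    // Toy example only: scale every pixel value by a gain factor on the GPU.
    // This is the embarrassingly parallel floating-point work GPUs excel at.
    #include <cuda_runtime.h>
    #include <cstdio>
    #include <cstdlib>

    __global__ void applyGain(float *pixels, int n, float gain)
    {
        int i = blockIdx.x * blockDim.x + threadIdx.x;  // one thread per value
        if (i < n)
            pixels[i] *= gain;
    }

    int main()
    {
        const int n = 12 * 1024 * 1024;              // roughly one 12-megapixel channel
        const size_t bytes = n * sizeof(float);

        float *host = (float *)malloc(bytes);
        for (int i = 0; i < n; ++i) host[i] = 0.5f;  // dummy image data

        float *dev;
        cudaMalloc(&dev, bytes);
        cudaMemcpy(dev, host, bytes, cudaMemcpyHostToDevice);

        const int threads = 256;
        const int blocks = (n + threads - 1) / threads;
        applyGain<<<blocks, threads>>>(dev, n, 1.3f);  // brighten by 30%
        cudaDeviceSynchronize();

        cudaMemcpy(host, dev, bytes, cudaMemcpyDeviceToHost);
        printf("first value after gain: %f\n", host[0]);  // prints 0.65

        cudaFree(dev);
        free(host);
        return 0;
    }

The point is simply that a card with a couple of hundred stream processors can chew through per-pixel work like that far faster than even a quad-core CPU; whatever Adobe actually ships will obviously be far more sophisticated.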
 
If it's dedicated photo work you want to focus on, and you are going to use something like Photoshop, then I would say a quad-core CPU and a large amount of RAM will be the most beneficial to you. Prices these days for either are very cheap; if you bought just the CPU and RAM on their own, it wouldn't cost you more than £200.

I build my own systems, but I think you would be happy with a Dell; they even sometimes have some great deals in their Outlet section.

As others said, a good monitor is probably the most important thing, and again the 24"+ Dells are quite good.

As for CS3 and graphics cards, I did notice this setting in it, but it doesn't appear to do a whole lot even though it suggests it will use the card.

[Attached screenshot: CS3.JPG]
 
CS3 on the Mac goes as far as asking whether to enable it, but I'm not sure what use it makes of it either!

[Attached screenshot: psperf.png]
 
I'm going to give a tentative vote for Dells here. I use a Mac for image editing, but I've been *hammering* my Inspiron 6000 for four years now, and it's still a good bit of kit.
 
Also consider two HDDs (or even two RAID 0 arrays): one for the system and one for your scratch disk/storage. A setup like that should serve you well for years to come.

I wouldn't touch RAID 0 if I were you. Hard disks do fail, and most people don't have backups of their entire hard disk. If you are using RAID 0, the level of fault tolerance is zero: if either of the hard disks fails, you lose everything. Using the figures on that page you linked to, it roughly doubles the chance of failure.

A better option (with three disks or more) is RAID 5. You get the performance increase and also the resilience of a true array, although you may need a different controller.

I use RAID 1 (mirroring) for my system volume, and RAID 5 for the data.
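
To put some rough numbers on that (a made-up 5% annual failure rate per disk, and ignoring the window while a failed disk is being replaced, so treat it as a sketch rather than gospel), the arithmetic works out like this:

    /* Rough illustration only: 5% per-disk annual failure chance is assumed,
       not measured. Shows why 2-disk RAID 0 roughly doubles the risk of
       losing everything, while RAID 1 and RAID 5 cut it right down. */
    #include <cstdio>

    int main()
    {
        double p = 0.05;  // assumed chance a single disk dies this year

        // RAID 0: any one failure loses the whole stripe set
        double raid0_2disk = 1.0 - (1.0 - p) * (1.0 - p);
        // RAID 1: data lost only if both mirrored disks die
        double raid1_2disk = p * p;
        // 3-disk RAID 5 survives one failure; lost only if 2 or more die
        double raid5_3disk = 3.0 * p * p * (1.0 - p) + p * p * p;

        printf("single disk:      %.2f%%\n", p * 100);            // 5.00%
        printf("RAID 0 (2 disks): %.2f%%\n", raid0_2disk * 100);  // 9.75%
        printf("RAID 1 (2 disks): %.2f%%\n", raid1_2disk * 100);  // 0.25%
        printf("RAID 5 (3 disks): %.2f%%\n", raid5_3disk * 100);  // 0.73%
        return 0;
    }

So two striped disks are roughly twice as likely to take your data with them as a single disk, whereas mirroring or RAID 5 reduces the risk dramatically, provided you replace a failed disk promptly. None of it is a substitute for proper backups, of course.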
 
Good link, cowasaki. One of the comments mentions OpenCL, which looks interesting, kind of like nVidia's CUDA in a way (it should be based on C): http://en.wikipedia.org/wiki/OpenCL
The Radeon X1900 had one of the first GPUs that could be used for some general-purpose computation, as far as I know. One of the Folding@home clients could use it for some operations. Then again, it may just have been used for better font rendering and the like.
 