Viewing at 200%

Sort of, but it will stop them getting lots of complaints about blurry images, which I suspect is why it's really there.

It's way too generic. The size of the print matters, and so does the size of the image. If your 40MP full-frame image is slightly blurred at 200% but you're printing it at 2" x 3", you'll never see it. If you follow their advice you'd probably reject lots of images you didn't need to, but it guarantees that none of them will look blurred in the final print - which means no calls into their care line.
 
But does anything above 100% not give a false picture of the image? I get jaggies with full size RAW - at least on some lines
 
I've not read the article, but if you have a high resolution screen, especially if it's small like my 15" QHD laptop, then viewing at 200% - even 400% - is the only way to know if a picture is sharp.
 
But does anything above 100% not give a false picture of the image? I get jaggies with full size RAW - at least on some lines

That's my point. It depends on the original image size.

Your eyes can only discern a certain number of pixels per inch. Anything more becomes indiscernible; anything less and you potentially start to see the pixels or their shape. Assuming 20/20 vision, you can't resolve better than 300ppi at 12", 115ppi at 30" and 50ppi at 6 feet (TV distance). With Blurb, and thus a book, one could assume 300ppi (a 12" reading distance). So if you take your image dimensions in pixels and divide by 300, that's the physical size in inches it can be printed at, and you won't see anything you can't already see at a normal viewing size.
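
If you want to sanity-check those numbers, they all fall out of the usual 1-arcminute rule of thumb for 20/20 acuity. A rough Python sketch (the exact figures shift with the acuity you assume):

```python
import math

def max_resolvable_ppi(viewing_distance_inches):
    """PPI above which a 20/20 eye (~1 arcminute of resolution)
    can no longer pick out individual pixels at this distance."""
    one_arcminute = math.radians(1.0 / 60.0)  # ~0.000291 rad
    pixel_pitch = viewing_distance_inches * math.tan(one_arcminute)
    return 1.0 / pixel_pitch                  # roughly 3438 / distance

for d in (12, 30, 72):  # book, desktop monitor, TV-ish distances
    print(f'{d}" -> {max_resolvable_ppi(d):.0f} ppi')
# 12" -> 286 ppi, 30" -> 115 ppi, 72" -> 48 ppi
```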

Jaggies (if I understand right) are artifacts along high-contrast edges in an image, often seen when zooming right in. Let's assume you have an older camera, or a crop (or both), and your image is 1800 x 1200 px. Dividing that by 300 gives me 6x4. As long as I print at 6x4 or smaller, it will look like it does on the screen - especially if I can arrange the image on the screen at a physical size of 6x4 inches. Even if I slap it into their biggest book (12" x 12") it will be 150ppi, which will look fine to my (old, non-20/20) eyes.
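
The divide-by-300 arithmetic is trivial to script if you want to check a batch of files. A quick sketch, with 300ppi standing in for the book-reading assumption above:

```python
def max_print_size(width_px, height_px, target_ppi=300):
    """Largest print (inches) at which the file still meets target_ppi."""
    return width_px / target_ppi, height_px / target_ppi

print(max_print_size(1800, 1200))        # (6.0, 4.0)  -> the 6x4 above
print(max_print_size(1800, 1200, 150))   # (12.0, 8.0) -> fine in a 12x12 book
```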

Viewing at 100% or 200% or 500% doesn't really help. You need an idea of what size it's going to be displayed at to understand how "miniaturised" the imperfections will be. Instagram for example is an online only thing. On my phone, the image size will be roughly 2" square. On my desktop it's about 5" square. The quality is about the same because the phone is often closer to my face than my monitor.

I think this is exactly where the information is unhelpful. People who know no better will be examining their digital SLR files and seeing imperfections at 200% that there's no way they'll see in print. Meanwhile there will be people trying to print heavy crops from a phone camera, which will also look bad at 200% - and those flaws will show in print.

It's generic information designed to prevent complaints. It's similar to how print companies request 300ppi no matter what size you're trying to print to.
 
Thank you and I'm going to ask a really stupid question because I'm not sure I've understood.

I always thought viewing at 100% means 1 image pixel = one screen dot. If you view at that scale you see things as accurately as you can without any pixel information being lost to fit the screen.

If you view at less, information is lost, but if you view at more false information is added by software and that's why I thought it was always an inaccurate way to view images.
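
To put numbers on how I understand that 1:1 mapping, here's a rough Python sketch (the 110ppi monitor figure is just an example, not my actual screen):

```python
def size_at_100_percent(width_px, height_px, monitor_ppi):
    """Physical on-screen size (inches) when 1 image pixel = 1 screen dot."""
    return width_px / monitor_ppi, height_px / monitor_ppi

# A 5Ds file (8688 x 5792 px) viewed at 100% on a ~110ppi monitor:
w, h = size_at_100_percent(8688, 5792, 110)
print(f"{w:.0f} x {h:.0f} inches")  # ~79 x 53 inches - poster-sized scrutiny
```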

I also thought jaggies occurred because there are too few pixels - in the case of zooming above 100%, because the pixels are spread out across more screen dots. I'm curious to know why they're more obvious on my newly acquired 5Ds than on my Fujis.

I might need to revisit this when I'm less befuddled.
 
Back in the old days when illustrations for print were drawn by hand they were drawn around twice as big as they'd appear in print because reducing the size makes them appear sharper.

The same principle applies to digital photos. If you display a blurry pic small enough it'll appear sharp. That's why a lot of phone snaps look great on the phone but s***e on a computer screen when they are enlarged/shown full size.
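
That shrink-to-sharpen effect is easy to reproduce yourself. A minimal Pillow sketch (the file names are made up):

```python
from PIL import Image

snap = Image.open("phone_snap.jpg")  # hypothetical file
w, h = snap.size

# At a quarter of the pixel dimensions, several soft pixels collapse
# into one, so blur that's obvious at 100% effectively disappears.
small = snap.resize((w // 4, h // 4), Image.LANCZOS)
small.save("phone_snap_small.jpg")
```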
 
I always thought viewing at 100% means 1 image pixel = one screen dot.
That's what I have always taken it to mean, however as Toni says above, on a smallish screen your eyes might not be able to see the individual pixels. As I type this the text looks smooth unless I get a magnifying glass out.
 
I always thought viewing at 100% means 1 image pixel = one screen dot. If you view at that scale you see things as accurately as you can without any pixel information being lost to fit the screen.
If you view at less, information is lost, but if you view at more false information is added by software and that's why I thought it was always an inaccurate way to view images.
It can... but it really depends on the monitor's native resolution, the display resolution setting (if lower), and the program's use of it (e.g. LR in low-resolution mode). If you view it at a large enough size that the square image pixels become visible on screen, that isn't "false information"; it's just that you have fewer image pixels than screen pixels.
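
Whether zooming past 100% "adds" anything depends entirely on the resampling the viewer uses. A small Pillow sketch makes the difference visible (the file name is hypothetical):

```python
from PIL import Image

img = Image.open("photo.jpg")  # hypothetical file
w, h = img.size

# Nearest-neighbour: each image pixel becomes a 4x4 block of screen
# pixels. Blocky, but nothing is invented - you just see the pixels.
blocky = img.resize((w * 4, h * 4), Image.NEAREST)

# Bilinear: in-between values are interpolated. Smoother, but those
# values were never in the file.
smooth = img.resize((w * 4, h * 4), Image.BILINEAR)

blocky.show()
smooth.show()
```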

With film, how large you could print was limited by the size of the negative, the quality of the negative (sharpness/recorded resolution), and film grain. Nothing has really changed with digital: it's the size of the sensor, the quality of the image (recorded resolution, not megapixels), and noise.

There's really no reason you can't use your monitor to determine optimal/acceptable print size/resolution; but it's not as simple as just viewing it at 200%.
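
One way to do it is to match physical size rather than pixels: work out the zoom at which an inch of print occupies roughly an inch of screen. A quick sketch (the 110ppi monitor is just an example figure):

```python
def print_preview_zoom(monitor_ppi, print_ppi=300):
    """Zoom (%) at which 1 inch of print covers ~1 inch of screen."""
    return monitor_ppi / print_ppi * 100

# On a 110ppi monitor, a 300ppi print previews at ~37% - not 200%.
print(f"{print_preview_zoom(110):.0f}%")
```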
 
I believe my issue with jaggies might be to do with the fact that I've been viewing on a different screen and the resolution is different from the one I usually used. I'm going to have a look at these jaggies on my other screen once I'm back on my feet. Thanks for your patience.
 
Assuming 20/20 vision, you can't resolve better than 300ppi at 12", 115ppi at 30" and 50ppi at 6 feet (TV distance).
Most people can't really discern 300ppi in an image, even with 20/20 vision aided by a low-power loupe. Anything above ~150ppi will deliver excellent results for any use other than viewing with a loupe. Somewhere below 100ppi (~50ppi) an image typically begins to pixelate and you run out of image resolution.
I have to run my 15" retina screen with pixel doubling (102ppi), because at native resolution I can't see things clearly enough to discern detail, even at ~20" with corrected vision. I also run LR/PS in low-resolution mode so that they use the display setting rather than native resolution.
 
If you check images at twice your intended output resolution and they still look good, then it hardly matters what the pixels might look like at 200%.
It will still satisfy people who get in closer than the intended viewing distance.

When looking at images at an exhibition I usually look at them at three distances: from a fair way away, to get a first impression; then at normal viewing distance; and, if details are important, close up - but no closer than what I might call a hand-held print distance.

There is absolutely no need to get all technical about it.
The impression of sharpness and detail is more important than true resolution, but more important than both is tonality.
 
I believe that's clearly right, Terry. I've been in bed a bit this week with a lot of unfilled time on my hands (but not a lot of mental clarity), and when I came across the Blurb thing I was surprised. Out of curiosity I looked at mine at 200% (not on my usual screen) and noticed some lines had severe jaggies, which made me question the validity of the 200% idea - hence my post. I was a bit surprised how jaggy some of my lines were, though!
 