What do we see when we look at images at 100%?

I have been thinking recently about what we actually see when we examine digital images.

This was brought to my attention when a recent online article used 300% images to show the difference between various digital camera outputs.

Now this immediately got me thinking back to the days when I used to make giant 3-metre-square enlargements for window displays. It did not matter what size of film was used, as we enlarged from anything between 13x18 cm cut film and 6x6 cm roll film. Whatever negative size we used, they were all grossly over-enlarged. However, from the outside of the window they all looked very good indeed. When examined close up it was easy to see the effect of the grain structure. This structure changed according to the emulsion and developer used. The sharpest grain was produced by Adox R14 developed in Neofin Blue, where even the edge effect of the compensating developer was apparent, while 100 ASA cut film developed in a D76-type developer produced a much softer granular effect.

But in all cases, the only actual artefacts were the high-contrast edge sharpening. At such magnifications the detail simply dissolved into nothingness, slowly but surely.
This is quite different from looking at an over-enlarged digital image, which is made up entirely of false detail.

Now that claim needs explaining.
The image produced by a sensor is a complex mixture of light and shade (luminance) data provided by each pixel, and colour information provided by a pattern of individually filtered pixels. These are usually arranged in either the Bayer pattern or, more recently, the more complex Fuji X-Trans pattern. The exception is the Foveon, which stacks its colour-sensitive layers.

For this reason, the colour information that falls on each pixel site has to be approximated by algorithms, working from the data obtained from the pixels that do receive the actual filtered red, green, and blue light. This is in itself complex, because there are twice as many green pixels in a Bayer array as either red or blue ones.
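
To make that concrete, here is a minimal sketch of bilinear demosaicing, the simplest such algorithm, assuming an RGGB layout. Real raw converters use far more sophisticated, edge-aware methods, but the principle is the same: at every pixel site, two of the three colour channels are calculated from neighbours rather than measured.

```python
import numpy as np

def conv2_same(img, k):
    """Tiny 2-D 'same' correlation with zero padding (no SciPy needed)."""
    ph, pw = k.shape[0] // 2, k.shape[1] // 2
    padded = np.pad(img, ((ph, ph), (pw, pw)))
    out = np.zeros_like(img, dtype=float)
    for i in range(k.shape[0]):
        for j in range(k.shape[1]):
            out += k[i, j] * padded[i:i + img.shape[0], j:j + img.shape[1]]
    return out

def demosaic_bilinear(mosaic):
    """Bilinear demosaic of an RGGB Bayer mosaic (2-D array, values 0..1).

    At each site, the two colour channels that were never sampled are
    estimated as the average of the nearest sites that did sample them.
    """
    h, w = mosaic.shape
    r_mask = np.zeros((h, w)); r_mask[0::2, 0::2] = 1.0
    b_mask = np.zeros((h, w)); b_mask[1::2, 1::2] = 1.0
    g_mask = 1.0 - r_mask - b_mask        # twice as many green sites
    box = np.ones((3, 3))                 # 3x3 neighbourhood average
    rgb = np.zeros((h, w, 3))
    for c, mask in enumerate((r_mask, g_mask, b_mask)):
        known = conv2_same(mosaic * mask, box)   # sum of measured neighbours
        count = conv2_same(mask, box)            # how many were measured
        rgb[..., c] = known / count              # interpolated estimate
    return rgb

# A flat mid-grey scene survives intact; detail at pixel scale would not.
print(demosaic_bilinear(np.full((6, 8), 0.5))[2, 3])   # -> [0.5 0.5 0.5]
```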

Other factors also come into play: there are inevitably gaps between the individual pixels that block light and cannot record it; there are microlenses that help to gather light falling obliquely onto the pixels; and very often there is a low-pass filter that spreads the light over more than one pixel to reduce moiré patterns.

It is true to say that even a raw file, direct from the camera, does not record the actual information at any pixel. The information we get to see has been produced entirely by algorithms. It has been produced entirely from artefacts, if you define artefacts as the result of interpolated, calculated data. This is not surprising if one considers that the only information available includes noise, partial coverage of the sensor surface, and defocused, filtered data obtained from a pattern of four or more pixels.

At best, examined at 100%, digital images contain far more approximated and false information than direct information. At 100%, fine detail is made up mostly of fine artefacts. At 300%, those faults and approximations are simply magnified without any further detail being offered.

When we produce large prints from our digital images we introduce a further pattern of dots and a further set of algorithms, which actually diffuse and disguise many fine artefacts; in this regard prints often look rather better than the same file seen on our screens, other things being equal.

I would suggest most files look their best, and can only look their best, on a screen at around 25%, and at that size even gross artefacts are largely invisible.
I would also suggest that 100% is only useful during processing, and 300% during operations like drawing a path or making a selection.
At less than around 50%, things like the artefacts that cause waxy skin and other false-data effects become virtually invisible.
It is true that as the algorithms used in raw processors improve, so does the “realism” of artefacts.
But this “realism” will never be a true representation.

We tend to judge camera output at 100%. I maintain that this is incorrect, as at that magnification we are seeing the output quality of the raw or JPEG processing rather than the potential difference between cameras and sensors.
It is interesting that when reprocessing old raw files from early digital cameras in the latest versions of Photoshop or Lightroom, the images can be greatly enhanced compared to the earlier processing. Old cameras are often better than we thought they were.

 
I never understand what is meant by "100%". 100% of what? Surely that just means the whole image. I have the same problem with "100% crop" which means the whole image is cropped.
 
It's all down to viewing distance. You don't zoom 3x to view the image from 2 ft. Stand back far enough and any enlargement will look fine.
I agree the algorithms in camera fill in the gaps, but the brain does the rest. We see what we expect to see, and combine all sorts of disparate elements to do so.
 
Well said Terry - and there is no doubt that film was, in some respects, better than digital images - not that I would ever want to go back to using films - digital photography is just so much more convenient!

But you could get excellent results from B/W films if you knew what you were doing.

I remember using Tri-X Pan for weddings, overexposing it 2 stops and then cutting the development time in order to avoid the "soot and whitewash" effect which you got in bright sunlight when photographing the bride's white dress and the groom's black suit.

My technique got full detail everywhere, especially in the bride's dress, which really delighted them! :)

My cameras in those days were the Mamiya C2 twin lens reflex with interchangeable lenses and a Ricoh Singlex 35mm for a backup.

https://www.photo.net/discuss/threads/presenting-the-mamiyaflex-c2.439965/

My flash unit was a Mecablitz 502 with a lead-acid battery pack carried over the shoulder:

https://collection.maas.museum/object/131861

Those were the days! :LOL:

Thank God for digital photography!
 
I never understand what is meant by "100%". 100% of what? Surely that just means the whole image. I have the same problem with "100% crop" which means the whole image is cropped.

No, it means that a portion of a 100% image is cropped.
 
A 100% image is shown pixel for pixel: one image pixel per screen pixel.
A 100% crop is a crop out of an image shown at that 100% magnification size.
 
I've recently dusted off my (long-forgotten) Canon 1D Mk I (yep, the original). It still works beautifully and I was interested to see how the 4MP sensor holds up.
You know what? Perfectly usable prints at 12x18, and for such an old sensor the colours, contrast etc. are still superb. I have to admit I have been guilty of this pixel-peeping malarkey and have probably spent money that wasn't necessary in the process. I came to my senses and sold my 5D Mk III and have happily now gone back to (and am loving) using my 1D Mk IIN for sports and kids and my very battered 1Ds Mk III for everything else.
Good post Terry.
 
I have my screen set to give an approximation of what the photo would look like at A3 size.
That's the largest I need, and if it looks good at that size, it will do for me; anything more is just being overly obsessive, trying to find faults.
 
Pixel peeping is rarely productive when trying to judge an image. The only time I go to 100% or more is when I'm doing pixel level cloning.
 
It doesn't seem helpful to me to describe calculated values and averages as 'artefacts' and I'm not sure why your opening post comes over as so negative (pun not intended) Terry.

I can't imagine anyone looks at images at 100% for pleasure, but instead to check they didn't miss focus, to deal with noise, sharpening, control halos, and in some cases to confirm that the apparent resolution of the image is adequate.
 
Maybe I am being thick, but what does this thread actually say for most of us?
 
It doesn't seem helpful to me to describe calculated values and averages as 'artefacts' and I'm not sure why your opening post comes over as so negative (pun not intended) Terry.

I can't imagine anyone looks at images at 100% for pleasure, but instead to check they didn't miss focus, to deal with noise, sharpening, control halos, and in some cases to confirm that the apparent resolution of the image is adequate.

I disagree; from what I see and read, it appears a good many people thoroughly enjoy looking at 100%.
Just this website has loads of examples, mainly looking for aberrations both real and imagined, mostly the latter.
 
I would suggest most files look their best, and can only look their best, on a screen at around 25%, and at that size even gross artefacts are largely invisible. I would also suggest that 100% is only useful during processing, and 300% during operations like drawing a path or making a selection. At less than around 50%, things like the artefacts that cause waxy skin and other false-data effects become virtually invisible. It is true that as the algorithms used in raw processors improve, so does the “realism” of artefacts. But this “realism” will never be a true representation.

We tend to judge camera output at 100%. I maintain that this is incorrect, as at that magnification we are seeing the output quality of the raw or JPEG processing rather than the potential difference between cameras and sensors.
It is interesting that when reprocessing old raw files from early digital cameras in the latest versions of Photoshop or Lightroom, the images can be greatly enhanced compared to the earlier processing. Old cameras are often better than we thought they were.

To be honest I often get a bit bored with the science and even with what's "best" and prefer to concentrate on what I like and what's good enough all things considered.

I crop quite a few pictures to 100%, and I even take quite a few pictures intending to crop them heavily, maybe to 100% too. For example, I often take flower / leaf / interesting-thing pictures with non-macro lenses, because I haven't got one with me or because I want the perspective I can get and the result after the crop. Not that people flock to see my pictures, but the ones who do look don't complain about the quality of the pictures that happen to be 100% crops. Because I do this, I'll judge cameras and software by the result I can get at 100%.

BTW, I think that improving software is one good reason to shoot and keep raws. I used to use RawShooter Essentials and at the time I thought it was OK, but after switching to CSx I've actually gone back and reprocessed quite a few pictures.
 
BTW, I think that improving software is one good reason to shoot and keep raws. I used to use RawShooter Essentials and at the time I thought it was OK, but after switching to CSx I've actually gone back and reprocessed quite a few pictures.

But I think that improving software is an excellent reason to shoot JPEGs!:)
 
It's a great post Terry!

One to get the 'grey matter' working for sure!

Very difficult to form a meaningful reply though; watching others with interest. (y)
 
Photography is about using light to make images.

Have a look at some of the world's best images and see their flaws!
 
Photography is about using light to make images.

Have a look at some of the world's best images and see their flaws!

Photography means different things to different people ;)

Some love the science of it
Some love the craftsmanship of it
Some love the art of it
Some love all the above

............all love photography.
 
We all see a little differently. I have only ever produced a single digital image from a camera made in the last decade* to be viewed as a 100% crop, and TBH it was pretty flat. One of the things I have come to like about higher-resolution images from larger sensors is that they seem to hold a greater subtlety and sense of depth even when downsampled for viewing on a screen. Images at 100% seem to lose all that depth and interest for me.

*I have owned sub-megapixel cameras. They did not produce impressive results.
 
Photography means different things to different people ;)

Some love the science of it
Some love the craftsmanship of it
Some love the art of it
Some love all the above

............all love photography.

I look at pictures of birds in flight, planes on the Mach Loop, semi-clad models and zero-DoF street shots of strangers, and although I can appreciate them and the technique used to take them, they mostly mean next to nothing to me. All I mostly want to do is take pictures of the things, people and places that mean something to me, so that I can look at them later, feel a link to the person, place or thing, and relive the moment. The kit is a part of it, but I could just as easily drop it and use my Kodak Instamatic.
 
The OP makes a very interesting point that nobody seems to have picked up on. And that is the fact, as the OP shows, that digital capture captures essentially... nothing. The image information is for the most part interpolated, or 'made up' by algorithms, into ephemeral 0s and 1s that can be interpreted and processed in any way you like. There is therefore, in a very real sense, nothing 'real' about any digital image. Think about that. A photograph used to be proof of reality, of being there, of 'being'. That's because there was always a direct and physical light/chemical interaction that resulted in a physical and provable 'reality' of being there, of existence: the negative or positive. With digital, that trust has gone as people become more aware of 'deep fakes' etc. The connection with reality has been severed. Hence, in my view, this is one of the reasons people are moving back to film. Of course analogue images can be manipulated, but that manipulation can be easily and visually traced and fakery exposed (pun intended), unlike digital. Digital cameras do not photograph reality. They create their own reality and we agree to believe it.
 
The OP makes a very interesting point that nobody seems to have picked up on. And that is the fact, as the OP shows, that digital capture captures essentially... nothing. The image information is for the most part interpolated, or 'made up' by algorithms, into ephemeral 0s and 1s that can be interpreted and processed in any way you like. There is therefore, in a very real sense, nothing 'real' about any digital image. Think about that. A photograph used to be proof of reality, of being there, of 'being'. That's because there was always a direct and physical light/chemical interaction that resulted in a physical and provable 'reality' of being there, of existence: the negative or positive. With digital, that trust has gone as people become more aware of 'deep fakes' etc. The connection with reality has been severed. Hence, in my view, this is one of the reasons people are moving back to film. Of course analogue images can be manipulated, but that manipulation can be easily and visually traced and fakery exposed (pun intended), unlike digital. Digital cameras do not photograph reality. They create their own reality and we agree to believe it.

Are we heading towards foil hats?;)
 
Hence, in my view, this is one of the reasons people are moving back to film.

More people are moving on to phones than back to film.

Film is dead as a mass medium.
 
A photograph used to be proof of reality

It seems the retouching industry was a myth then. ;)

No malice intended BTW.

As someone who deals with lots of numbers generated digitally as a result of shining light through a bit of liquid and plastic, let me say that there is nothing ephemeral about 0s and 1s, whether they are used to produce a picture or whether they are used to tell if someone has a nasty virus. Those digits aren't magic and they don't come from some 'spirit realm', but occupy an actual physical space in a storage medium on your computer or camera memory card.

Film photography is a digital process too. Where you have no light, the silver halide remains unchanged - a 0. When a photon strikes a crystal and changes it, you have a 1. The more light, the more 1s and the greater the image density. All we've done is change the process from one using chemicals into another using electrons. Few photographers could explain how a photon interacts with the halide, just as few could really explain how a photo-sensor works.

Film is nice, but don't fool yourself about the process being truly analogue - there is no seamless movement between zero density and absolute density, but rather it's just an early form of digitisation where we're often proud to show the digits.
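
That thresholded, grain-by-grain picture can even be simulated. Here is a toy sketch (illustrative numbers only, not real emulsion physics): treat each grain as a photon counter that becomes developable once enough photons land, and an S-shaped density response emerges from nothing but 0/1 statistics.

```python
import numpy as np

rng = np.random.default_rng(0)

def toy_film_density(mean_hits, n_grains=100_000, threshold=4):
    """Toy model: each grain counts Poisson-distributed photon hits and
    becomes developable (a '1') only past a threshold; 'density' is the
    fraction of grains that flipped. Illustrative numbers only."""
    hits = rng.poisson(mean_hits, size=n_grains)
    return np.mean(hits >= threshold)

# Doubling the exposure each step traces out an S-shaped response curve.
for exposure in (0.5, 1, 2, 4, 8, 16):
    print(f"mean hits/grain {exposure:5.1f} -> density {toy_film_density(exposure):.3f}")
```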
 
It doesn't seem helpful to me to describe calculated values and averages as 'artefacts' and I'm not sure why your opening post comes over as so negative (pun not intended) Terry.

I can't imagine anyone looks at images at 100% for pleasure, but instead to check they didn't miss focus, to deal with noise, sharpening, control halos, and in some cases to confirm that the apparent resolution of the image is adequate.

An artefact is something made or constructed. It might be a near copy of reality, but it is a construct nevertheless.
Unfortunately not all constructs are good copies.

While a group of four pixels might contain sufficient information to define that group as a whole, a single pixel does not contain sufficient information to reconstruct it. So detail cannot be accurately defined by individual pixels, only by groups of pixels. Of course, any given pixel will be part of the make-up of four such groups of four, and an even greater number of groups of eight or other sizes. All of these possible groups and values are used in the algorithms that create the final image.
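
That "group of four" idea can be made concrete with a minimal sketch (again assuming an RGGB layout): collapse each 2x2 quad into a single RGB value, and every output pixel is built only from directly measured data, at the cost of half the resolution in each direction.

```python
import numpy as np

def bayer_quads_to_rgb(mosaic):
    """Collapse each 2x2 RGGB quad into one directly measured RGB value.

    No interpolation at all: each output pixel uses only the four
    photosites inside its own quad. Assumes even image dimensions.
    """
    r = mosaic[0::2, 0::2]
    g = (mosaic[0::2, 1::2] + mosaic[1::2, 0::2]) / 2.0   # the two green sites
    b = mosaic[1::2, 1::2]
    return np.stack([r, g, b], axis=-1)
```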

However, the greater the enlargement towards the 100% pixel level and beyond, the less "real" data is involved.
What can be achieved today is astonishingly good, but at pixel level it is still largely a chancy approximation that varies from terrible to excellent.
Images always look better at sizes significantly smaller than the 100% pixel level, as more data is available and the approximations become more accurate.
 
A lot comes down to intended final usage.

I work in advertising and from time to time we've had to get in some of the sheets from a 48-sheet poster (the big 2x1 billboards you see by the side of roads or across rail tracks).

The amount of detail in the shots is staggeringly low when seen up close. But because they're never designed to be viewed this way, it all just sort of works when you see them from a distance.

It's a bit like that painting of Myra Hindley that was made up of kids' hand prints. Up close, you could see the individual hands, but the painting was mounted and hung so that you first saw it from 200 or 300 feet away and could clearly see it was her.
 
A lot comes down to intended final usage.

I work in advertising and from time to time we've had to get in some of the sheets from a 48-sheet poster (the big 2x1 billboards you see by the side of roads or across rail tracks).

The amount of detail in the shots is staggeringly low when seen up close. But because they're never designed to be viewed this way, it all just sort of works when you see them from a distance.

It's a bit like that painting of Myra Hindley that was made up of kids' hand prints. Up close, you could see the individual hands, but the painting was mounted and hung so that you first saw it from 200 or 300 feet away and could clearly see it was her.

The standard we used to set was that if a print of an image was excellent in the hand at 10x8, then it would enlarge to any size, to be seen at the appropriate viewing distance; that included 48-sheet posters.
That standard still holds true for digital images.

However, huge digital images take on a rather nasty appearance when viewed close up, compared to an equivalent silver image.
 
Digital cameras do not photograph reality. They create their own reality and we agree to believe it.

Reality is just a matter of perspective. After all, if one observed the Universe at "100%", we would see that it is in fact pixelated, since at its base layer it comprises a matrix of "Planck lengths".

If the Universe is pixelated, could it be a pattern of design? Could our Universe be a simulation or a manipulation? Who knows. One would have to leave this universe (or reality) to have any evidence to support the argument either way. One would have to be observing "the sensor" rather than being a pixel on it.
 
@Terrywoodenpic I largely agree. I've been saying it for ages... it's all about what you point it at.

That said, I'm not sure that your assertion that a digital image is produced entirely from artefacts is a useful contribution.
The camera is storing a (filtered) sample of the available information.

I'll gloss over the details because it's a long time since I studied them in any depth and my knowledge is now sketchy.

That said, the Shannon-Nyquist sampling theorem applies to pretty much all digitisation processes. It says that if your sampling rate is sufficiently high (more than twice the highest frequency present in the signal) - and that rate is rather lower than you might expect - then you have all the information you need to perfectly reconstruct the original signal, free of artefacts.

There's obviously a lot of relevant mathematical detail around filters, lens resolution, sensor resolution and image reconstruction which I won't go into, but the point remains: the raw file isn't pure artifice.
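
As a toy illustration (a one-dimensional sketch; the same idea extends to two-dimensional images): sample a band-limited signal at more than twice its highest frequency, and its value at any instant between the samples can be recovered by sinc interpolation.

```python
import numpy as np

F_MAX = 3.0                    # highest frequency in the signal (Hz)
FS = 8.0                       # sampling rate, above the Nyquist rate 2*F_MAX
T = 1.0 / FS

def signal(t):
    """A band-limited test signal: components at 2 Hz and 3 Hz only."""
    return np.sin(2 * np.pi * 2.0 * t) + 0.5 * np.cos(2 * np.pi * F_MAX * t)

n = np.arange(-200, 201)       # a finite window of sample indices
samples = signal(n * T)

def reconstruct(t):
    """Whittaker-Shannon (sinc) interpolation from the discrete samples."""
    return np.sum(samples * np.sinc((t - n * T) / T))

t_between = 0.123              # an instant that falls between sample points
print(signal(t_between), reconstruct(t_between))  # agree closely; exact only
                                                  # with infinitely many samples
```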
 
Photography is about using light to make images.

Have a look at some of the world's best images and see their flaws!
Photography means different things to different people ;)

Some love the science of it
Some love the craftsmanship of it
Some love the art of it
Some love all the above

............all love photography.
Ah but, it’s covered in @Pound Coin’s post but easy to miss.
Photography is both the art and science of creating images with light.
Scientifically, all we’re doing is capturing photons in our camera, but artistically we’re choosing which photons and even occasionally creating our own photons to capture.
 
I never understand what is meant by "100%". 100% of what? Surely that just means the whole image. I have the same problem with "100% crop" which means the whole image is cropped.

Normally it means that 1 pixel of the image is displayed as 1 pixel on the viewing device; however, it can also be understood as actual print size.
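
To put rough numbers on that definition, here is a small sketch (the image and screen sizes are hypothetical):

```python
# Hypothetical sizes, just to put numbers on the definition.
image_width_px = 6000    # e.g. a 24MP sensor produces 6000 x 4000 images
screen_width_px = 1920   # a common monitor width

fit_scale = screen_width_px / image_width_px
print(f"fit-to-width scale: {fit_scale:.0%}")   # 32%: each screen pixel
                                                # blends ~3 image pixels
# At 100%, one image pixel maps to one screen pixel, so only a
# 1920-pixel-wide window of the 6000-pixel image is visible at once.
```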

Mike
 
The OP makes a very interesting point that nobody seems to have picked up on. And that is the fact, as the OP shows, that digital capture captures essentially... nothing.
And it does it in much the same way our eyes and brain do.
It is incorrect to say pixels see only red, blue, or green. They are actually R/G/B-centric and have some sensitivity to other wavelengths. This is very much the same as the way our eyes work, only there the cones are identified as long/medium/short wavelength sensitive rather than RGB (it's the same thing). Our eyes are also most sensitive to the green-centric wavelengths... that's why the Bayer array has twice as many green-centric pixels.

The fact that any of this works is kind of amazing... but I would not say any of it is false.
In any one of the processes (eyes/film/digital) there may be errors/differences, and when you combine/stack processes you also combine/stack the errors, but AFAIK there is no real solution for any/all of that nor one that could be called "ideal."
 
Normally it means that 1 pixel of the image is displayed as 1 pixel on the viewing device; however, it can also be understood as actual print size.

Mike
OK, on my computer the screen is 4095 pixels wide, so at full screen the image will be shown at 60%.
 
That said, the Shannon-Nyquist sampling theorem applies to pretty much all digitisation processes. It says that if your sampling rate is sufficiently high (more than twice the highest frequency present in the signal) - and that rate is rather lower than you might expect - then you have all the information you need to perfectly reconstruct the original signal, free of artefacts.
You're talking about Fourier transforms and the Nyquist frequency... yes, it applies to digital sensors, and to our eyes/brain.
 
I have been thinking recently about what we actually see when we examine digital images.

This was brought to my attention when a recent online article used 300% images to show the difference between various digital camera outputs.

Now this immediately got me thinking back to the days when I used to make giant 3-metre-square enlargements for window displays. It did not matter what size of film was used, as we enlarged from anything between 13x18 cm cut film and 6x6 cm roll film. Whatever negative size we used, they were all grossly over-enlarged. However, from the outside of the window they all looked very good indeed. When examined close up it was easy to see the effect of the grain structure. This structure changed according to the emulsion and developer used. The sharpest grain was produced by Adox R14 developed in Neofin Blue, where even the edge effect of the compensating developer was apparent, while 100 ASA cut film developed in a D76-type developer produced a much softer granular effect.

So we are talking black and white film then? In which case nothing you took was a real image but just an interpretation of the tonal values. Would you try to make us believe that that was any more of a real representation than we see now using digital?

Mike
 
So we are talking black and white film then? In which case nothing you took was a real image but just an interpretation of the tonal values. Would you try to make us believe that that was any more of a real representation than we see now using digital?

Mike

You can believe exactly what you like..... I have no intention of making you believe anything.

Imaging using silver halide in its various forms has been used over many years as the most fruitful way to detect and capture a vast range of the electromagnetic spectrum, from the infra-red to X-rays and beyond, and has also been used to capture and trace particles such as gamma rays, in much the same way as a cloud chamber can, by using a built-up block of emulsion.

Silver halide crystals can be doped to respond to the various wavelengths of coloured light, enabling black and white and colour photography in all its forms. This research reached its pinnacle just before digital photography displaced it almost completely. Dyes, colour couplers and various colour-masking techniques have led to the development of negative-positive and transparency colour films. Even strange colour processes like Technicolor, which uses only filtered black and white film, and Kodachrome, where each coloured layer is processed in sequence, rely on the sensitivity of the silver halide crystal.

If you consider that this is in some way artificial, or a manipulation of the truth, then so be it.

But in image terms, it does not produce artefacts using calculated data. Whatever results remains in direct proportion to the photons striking silver halide crystals.
While shot noise must occur because of the random nature of the arrival of photons at individual crystals, it is rarely if ever evident, because it rarely exceeds the threshold needed to produce developable silver.

Granularity is different, as it is a physical property of the clumping of the crystals. The grains that we see in the finished image are the holes between those grains, not the grains themselves. While the grain structure of a film is fixed, how it is developed alters the surface of those grains, which can then appear hard and sharp or soft and woolly.
 
You can believe exactly what you like..... I have no intention of making you believe anything.

Don't sit on the fence like a politician: was it any more real or not? What about when we used to say that a film had grain like golf balls? Think of that Agfa 1000 ASA slide film: all we had was one big grain that averaged many rays of light and gave that soft look. Then there is infra-red film: was that a true representation? I have my answers, but the reality is that digital at the top level records in such fine detail that it captures light at very many more sites than film ever did at equivalent ISO.

Mike
 