Full frame & diffraction

I'm contemplating moving from m4/3 to a FF system for a large number of reasons including the ability to make larger prints.
One of the issues with m4/3 is that diffraction starts to be noticeable at f/11, or even wider if you go looking for it.
At what point does it become an issue with a 24 Mpix FF sensor? Or a 36 Mpix sensor?
Does removing the anti-aliasing filter help?
 
Working to 'round number' apertures, I find f/11 to be the last one where diffraction isn't too bad, f/16 is starting to be an issue (but if I need it for depth of field I use it) and f/22 I avoid like the plague and merge two files together for depth of field.

(24mpx FX sensor)
 
A bit OT, but how is a FF sensor going to give you larger prints?
 
A bit OT, but how is a FF sensor going to give you larger prints?

It's a bigger sensor. Blowing an image up only ever makes it look worse, and for any given print size you have to enlarge a smaller sensor's image more. So in theory and in practice, if the technologies are anywhere near equal, an image taken with a camera with a bigger sensor will look better when printed big than a picture taken with a camera with a smaller sensor.
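To put rough numbers on that, here's a minimal sketch (the pixel counts and sensor widths are assumed, typical figures for a 16 MP m4/3 body and a 24 MP full-frame body):

```python
def print_ppi(pixels_long_edge, print_inches_long_edge):
    """Pixels available per inch of print, along the long edge."""
    return pixels_long_edge / print_inches_long_edge

for name, px in [("m4/3 16 MP", 4608), ("FF 24 MP", 6000)]:
    print(f"{name}: {print_ppi(px, 24):.0f} ppi on a 24-inch-wide print")

# Linear enlargement from sensor to print, assuming 17.3 mm and 36 mm
# sensor long edges for m4/3 and full frame respectively.
for name, sensor_mm in [("m4/3", 17.3), ("FF", 36.0)]:
    print(f"{name}: {24 * 25.4 / sensor_mm:.1f}x enlargement for a 24-inch print")
```

Same print, more pixels per inch and less enlargement from the bigger sensor, which is the whole argument in two lines of arithmetic.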
 
You might be able to google to find the answer; 24 MP APS-C is limited by around f/7.1 or f/8 I think.

The interesting thing for me is that the supposed "oh, we will have 100 MP cameras sometime" would mean being diffraction limited at really low f-stops.

We might hit a wall just like film did.
 
Diffraction is inevitable; it occurs at all apertures. But there is a point at which resolution becomes diffraction limited: no more detail is available, because the increase in diffraction blurring at the point of focus outweighs any advantage gained from decreasing the aperture further.
Depth of field, however, continues to extend, just at the new diffraction-limited maximum level of detail.

On a full frame there are advantages to be gained stopping down as far as a particular lens allows, which is usually about one stop further than the point at which it becomes diffraction limited.

An f/16 lens may well be diffraction limited at f/11 and an f/22 one at f/16, the reason being that an f/16 lens will normally have a shorter focal length than an f/22 one.
 
You might be able to google to find the answer; 24 MP APS-C is limited by around f/7.1 or f/8 I think.

The interesting thing for me is that the supposed "oh, we will have 100 MP cameras sometime" would mean being diffraction limited at really low f-stops.

We might hit a wall just like film did.

My point exactly! I don't want to buy a 36mpix body only to find that I can print massive but fuzzy. I think I'm going to have to hire something and do the tests.
 
If you look at resolution testing for lenses, a full-frame camera starts to drop off at f/8, although it's really only noticeable past f/11-f/16.

With micro four thirds you state f/11, but realistically it's f/4, and again it's more noticeable at f/5.6-f/8.
 
Some good info here
Thank you - I'd seen that article before but not part 2, which goes on to explain that diffraction is independent of resolution and focal length. It's always possible to downsize a 36 Mpix image to 24 and the effect of diffraction would be the same, so choosing fewer, larger pixels on the grounds of avoiding diffraction would be folly.

Frankly, I should have been able to work that out for myself, but I hadn't thought about it properly.
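As a rough illustration of why that's true, here's a minimal sketch (the pixel pitches are my own approximate figures for 24 MP and 36 MP full-frame sensors, and green light is assumed): the diffraction blur on the sensor depends only on the f-number and wavelength, so it has the same physical size whatever the pixel count.

```python
WAVELENGTH_UM = 0.55  # green light, in micrometres

def airy_disk_diameter_um(f_number, wavelength_um=WAVELENGTH_UM):
    """Diameter of the Airy disk (to the first minimum), in micrometres."""
    return 2.44 * wavelength_um * f_number

for n in (5.6, 8, 11, 16):
    print(f"f/{n}: Airy disk ~{airy_disk_diameter_um(n):.1f} um "
          f"(pixel pitch ~6.0 um at 24 MP FF, ~4.9 um at 36 MP FF)")
```

The blur is the same size in micrometres either way; the higher-resolution sensor just samples it more finely.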
 
Thank you, real-world info is useful. I'm currently pushing 24x36 with careful upsizing from m4/3 at f/6.3. The results are OK but could be better.

I've printed 24x16 on crop at f/16 and it's fine until you are up close; for hanging in your home and stood back a bit, they look great.

You'll have no issue with a D800, a decent lens and prints that size, none whatsoever.
 
I've printed 24x16 on crop at f/16 and it's fine until you are up close; for hanging in your home and stood back a bit, they look great.

You'll have no issue with a D800, a decent lens and prints that size, none whatsoever.
Thanks! I'm aiming for at least 36x48 and am currently torn between the Sony A7II and the A7rII when it appears, assuming that they do everything else they need to.
 
Thanks! I'm aiming for at least 36x48 and am currently torn between the Sony A7II and the A7rII when it appears, assuming that they do everything else they need to.

I've not done any that big, but they should be OK.

TBH sharpness comes from proper, accurate focussing; get that right and you will be fine. Either of these bodies is fine.

Edit: PS Here is my Fine Art America full-res file from a Skye shot. OK, it's sharpened a little, but it was shot at f/11. Buy a print LOL and see how it looks.

http://1-stephen-taylor.artistwebsites.com/featured/garreg-ddu-reservoir-stephen-taylor.html
 
Canon 5D3: I can "just" see diffraction problems at f/11 when pixel peeping in Lightroom. In print, I can't really see it. That said, I only go higher for particular starburst effects. For my G7X it's pretty certainly limited after f/5.6.
 
That said, I've just found two reasonably similar examples of f11 to f22.

F11


F22


Nothing much in it, but boy did I wish I had cleaned my grads better.
 
Some good info here

Indeed some useful information, but it gives the impression that diffraction depends on the lens, which it doesn't at all.
Diffraction occurs whenever light passes an edge. The light that passes through the centre of a hole or lens is not diffracted.

With a small hole, the proportion of the light passing the edge of the aperture is greater than the proportion passing the edge of a large aperture, so it shows a greater effect.

The light that passes through the centre portion is undiffracted and always forms a sharp image; however, it is overlaid by the diffracted light forming the Airy disk, and becomes progressively degraded.

It is for this reason that some camera makers are able to counter diffraction to some extent in firmware/software, by reverse engineering the effect.
This is not as easy as it sounds, as diffraction occurs at all edges in a lens system, not just at the aperture.
 
You might be able to google to find the answer; 24 MP APS-C is limited by around f/7.1 or f/8 I think.

My best lenses on a 24 MP APS-C sensor are obviously a bit more diffraction-softened at f/8 than they are at f/5.6. I haven't bothered to check in smaller increments than a stop, so it might be that the best lens at a fraction of a stop more or less than f/5.6 is even sharper.
 
Indeed some useful information, but it gives the impression that diffraction depends on the lens, which it doesn't at all.
Diffraction occurs whenever light passes an edge. The light that passes through the centre of a hole or lens is not diffracted.

With a small hole, the proportion of the light passing the edge of the aperture is greater than the proportion passing the edge of a large aperture, so it shows a greater effect.

The light that passes through the centre portion is undiffracted and always forms a sharp image; however, it is overlaid by the diffracted light forming the Airy disk, and becomes progressively degraded.

It is for this reason that some camera makers are able to counter diffraction to some extent in firmware/software, by reverse engineering the effect.
This is not as easy as it sounds, as diffraction occurs at all edges in a lens system, not just at the aperture.
No, I'm afraid that's just not true! The less of a lens you use, e.g. by only using its centre, the worse the image will be. This is related to the optical resolution of an imaging system. You're right that edge effects are important and show up diffraction-related phenomena, but the edge is not what causes diffraction. The wave nature of light does - it might sound like nit-picking, but it's an important distinction with real-world outcomes.
 
It is for this reason that some camera makers are able to counter diffraction to some extent in firmware/software, by reverse engineering the effect.
This is not as easy as it sounds, as diffraction occurs at all edges in a lens system, not just at the aperture.

Fwiw it sounds like a deconvolution - or perhaps an iterative reconvolution - problem. These are computationally expensive in one dimension, let alone two, though it's possible a dedicated chip with GPU-like hardware could do the job in reasonable timescales. Or it could be offloaded to a graphics card on importing into LR (etc).
 
No, I'm afraid that's just not true! The less of a lens you use, e.g. by only using its centre, the worse the image will be. This is related to the optical resolution of an imaging system. You're right that edge effects are important and show up diffraction-related phenomena, but the edge is not what causes diffraction. The wave nature of light does - it might sound like nit-picking, but it's an important distinction with real-world outcomes.

Diffraction is not caused by lenses; it is caused by waves passing an edge. It can be clearly shown by waves in water passing through a gap, which even shows the interference patterns that develop.

It is used in science with diffraction gratings in numerous ways.

If you restrict a lens with a small aperture you increase the proportion of the light waves that pass an edge.
With no aperture, those same light waves would pass through that space undiffracted.
The effect is caused by the presence of the edge, not the lens.
 
Last edited:
Diffraction is not caused by lenses; it is caused by waves passing an edge. It can be clearly shown by waves in water passing through a gap, which even shows the interference patterns that develop.

It is used in science with diffraction gratings in numerous ways.

If you restrict a lens with a small aperture you increase the proportion of the light waves that pass an edge.
With no aperture, those same light waves would pass through that space undiffracted.
The effect is caused by the presence of the edge, not the lens.
No, diffraction effects are intrinsic to wave propagation in general. They do not require an edge to occur. If you underfill a lens, so that there is no intensity at the edges, it will still behave like a stopped-down lens, despite the lack of interaction of any light energy with the edge of the lens. It's a form of self-interference during propagation, and it leads to the finite size of the Airy disk.
 
Fwiw it sounds like a deconvolution - or perhaps an iterative reconvolution - problem. These are computationally expensive in one dimension, let alone two, though it's possible a dedicated chip with GPU-like hardware could do the job in reasonable timescales. Or it could be offloaded to a graphics card on importing into LR (etc).
Yes, it's likely deconvolution based on the measured point spread function of the lenses.
For Fujifilm some info is here:
http://fujifilm-x.com/development_story/en/processor/
And here:
http://fujifilm-x.com/development_story/en/developer/lens_modulation_optimiser/

Some key things of note: They show multiple examples of the point spread function (PSF) and mention measuring it for different focal lengths (I'd imagine some lenses would also need some measurements for different focal distances too). The chip architecture also shows a vector graphics accelerator. These are ideal for speeding up fast Fourier transforms. If you have a lookup table of PSFs and a way to do FFTs efficiently in hardware, then you can do a pretty good deconvolution to remove some of the lens artifacts. It's definitely a sensible way of going about things.
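For anyone curious what that could look like in practice, here's a minimal sketch of PSF-based Wiener deconvolution with FFTs. It's only an illustration of the general idea, not Fujifilm's actual Lens Modulation Optimiser; the Gaussian PSF, the noise-to-signal term and the test image are all made up.

```python
import numpy as np

def centred_otf(psf, shape):
    """Pad the PSF to `shape` and roll it so its centre sits at the origin,
    then return its Fourier transform (the optical transfer function)."""
    padded = np.zeros(shape)
    padded[:psf.shape[0], :psf.shape[1]] = psf
    padded = np.roll(padded, (-(psf.shape[0] // 2), -(psf.shape[1] // 2)), axis=(0, 1))
    return np.fft.fft2(padded)

def wiener_deconvolve(image, psf, nsr=0.01):
    """Wiener deconvolution: divide out the OTF, damped by a noise-to-signal term."""
    H = centred_otf(psf, image.shape)
    G = np.fft.fft2(image)
    return np.real(np.fft.ifft2(np.conj(H) * G / (np.abs(H) ** 2 + nsr)))

# Toy example: a small Gaussian standing in for a measured diffraction PSF.
x = np.arange(-7, 8)
xx, yy = np.meshgrid(x, x)
psf = np.exp(-(xx ** 2 + yy ** 2) / (2 * 2.0 ** 2))
psf /= psf.sum()

sharp = np.zeros((256, 256))
sharp[96:160, 96:160] = 1.0                       # a bright square as test detail
blurred = np.real(np.fft.ifft2(centred_otf(psf, sharp.shape) * np.fft.fft2(sharp)))
restored = wiener_deconvolve(blurred, psf)        # edges come back noticeably sharper
```

A camera would presumably look up a measured PSF for the lens, focal length and aperture in use rather than assume a Gaussian, but the FFT-multiply-inverse-FFT structure is the same.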
 
As I understand it, diffraction is produced when light passes an edge, such as the iris edge in a lens. But what happens to diffraction if the sharp iris edge is replaced by the softly graded edge of an apodization filter, as in the Sony/Minolta 135mm STF lens?
 
As I understand it, diffraction is produced when light passes an edge, such as the iris edge in a lens. But what happens to diffraction if the sharp iris edge is replaced by the softly graded edge of an apodization filter, as in the Sony/Minolta 135mm STF lens?
Diffraction is not really about edges at all - that's a simplification given at secondary level and it's not really helpful. It's really about what happens to finite-size beams as they propagate. The size can be due to the size of the source or because the beam passed through an aperture. The soft edges of an apodisation filter affect the out-of-focus areas, but diffraction still occurs. One thing that will happen is that the lens will behave as if it's slightly stopped down with an apodisation filter in place.
 
With a small hole, the proportion of the light passing the edge of the aperture is greater than the proportion passing the edge of a large aperture, so it shows a greater effect.

Sorry guys, I know very little about this, but the statement above makes no sense whatsoever. Surely a larger hole has a greater edge area than a smaller hole, so the amount of light hitting the edge must be greater?

I'm happy to be educated :)
 
Sorry guys, I know very little about this, but the statement above makes no sense whatsoever. Surely a larger hole has a greater edge area than a smaller hole, so the amount of light hitting the edge must be greater?

I'm happy to be educated :)
There is a larger edge in a larger aperture, but it's smaller in proportion to the amount of light not passing the edge: the edge length increases linearly with the diameter of the aperture, whereas the area of the aperture increases as the square of the diameter. But if Cuchulainn is right then this is an oversimplification as an explanation of why smaller apertures diffract more, or of why overexposed lights shot with a polygonal aperture produce diffraction starburst rays.
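A quick sketch of that scaling, with arbitrary example diameters:

```python
import math

for diameter_mm in (8.0, 4.0, 2.0):  # e.g. a 50 mm lens at roughly f/6.3, f/12.5, f/25
    circumference = math.pi * diameter_mm
    area = math.pi * (diameter_mm / 2) ** 2
    print(f"{diameter_mm} mm aperture: edge/area ratio = {circumference / area:.2f} per mm")
```

Halve the diameter and the edge-to-area ratio doubles, which is the sense in which a small hole has proportionally more edge.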
 
Think of a bunch of parallel rays passing through a lens and getting focused down to a spot. The rays in the middle are hardly deviated at all, while those further from the middle are increasingly bent in order to get to the focus. The size of the focus spot (the Airy disk mentioned above) is related to the range of angles of the light rays that form it. The wider the range of angles, the smaller the spot - this is why stopping down the lens decreases the resolution: the high-angle rays are blocked by the aperture and cannot contribute to the image. Note that this is not due to the edge, but due to the light being blocked entirely. If you follow this image-forming process through mathematically, you find that the relationship where the angle of the light at one point of the system is converted to a spatial extent elsewhere is equivalent to something called a Fourier transform. Knowing this, you can then both look at how the shape of the aperture causes things like starbursts, and use knowledge of the Airy function for the lens to improve the resulting images.
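If you want to see that Fourier relationship numerically, here's a minimal sketch (the grid size and pupil radius are arbitrary): the focal-plane intensity pattern from a uniformly lit circular aperture is the squared magnitude of its 2-D Fourier transform, i.e. the Airy pattern.

```python
import numpy as np

N = 512
y, x = np.mgrid[-N // 2:N // 2, -N // 2:N // 2]
pupil = (x ** 2 + y ** 2 <= 40 ** 2).astype(float)  # circular aperture, radius 40 px

# Far-field diffraction pattern: squared magnitude of the aperture's Fourier transform.
field = np.fft.fftshift(np.fft.fft2(np.fft.ifftshift(pupil)))
airy_pattern = np.abs(field) ** 2
airy_pattern /= airy_pattern.max()

# Halve the pupil radius (i.e. stop down) and the central Airy disk roughly
# doubles in width: smaller aperture, bigger blur.
```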
 
The following pages answer all the above questions and doubts. Diffraction is far more of a concern for astronomers, who are plagued by it, than for photographers.
Astronomical telescopes, especially small ones, are badly affected by diffraction.
See: http://www.rocketmime.com/astronomy/Telescope/ResolvingPower.html

You will note that this has nothing to do with lenses, but with the light passing through the opening.

Secondly, any edge or obstruction in the optical path of a telescope forms additional, complex diffraction:
http://www.beugungsbild.de/diffraction/diffraction.html

In a camera objective, all the openings tend to be near circular and the only obstructions are the edges themselves; while the form of those edges can change the pattern of the diffraction, it cannot reduce it.
Polygonal apertures in camera lenses produce starbursts for the same reason obstructions do in telescopes.
Of course an infinitely large aperture would not cause diffraction, but all real lens constructions do.
 
Think of a bunch of parallel rays passing through a lens and getting focused down to a spot. The rays in the middle are hardly deviated at all, while those further from the middle are increasingly bent in order to get to the focus. The size of the focus spot (the Airy disk mentioned above) is related to the range of angles of the light rays that form it. The wider the range of angles, the smaller the spot - this is why stopping down the lens decreases the resolution: the high-angle rays are blocked by the aperture and cannot contribute to the image. Note that this is not due to the edge, but due to the light being blocked entirely. If you follow this image-forming process through mathematically, you find that the relationship where the angle of the light at one point of the system is converted to a spatial extent elsewhere is equivalent to something called a Fourier transform. Knowing this, you can then both look at how the shape of the aperture causes things like starbursts, and use knowledge of the Airy function for the lens to improve the resulting images.


Nice, but absolute nonsense.
Diffraction and Airy disks are not caused by lenses.

You are confusing DIFFRACTION with REFRACTION, which occurs when light passes from one medium to another.
Two very different phenomena.
 
It's not just the aperture that has an impact on image sharpness; the anti-aliasing filter and the chip used matter as well. There was a discussion several years ago on the Luminous Landscape web site about this. Some very serious physics was discussed, way over my head. However, the upshot was a simple(ish) formula that you could use to find the optimum aperture with regard to the sensor/lens combination. It turns out that for my 21 MP Canons the optimum aperture is f/9.5. Now I have shot at much smaller apertures, and to be honest f/11-f/16 is fine for most images. I haven't gone pixel peeping, but generally I shoot around f/8 anyway unless the subject warrants a different setting.
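For what it's worth, here's a minimal sketch of one common rule of thumb (not necessarily the exact formula from that Luminous Landscape thread): treat the system as diffraction limited once the Airy disk diameter spans roughly two pixels. With a ~6.4 um pixel pitch assumed for a 21 MP full-frame Canon it lands very close to f/9.5; the other pixel pitches below are also my own approximations.

```python
WAVELENGTH_UM = 0.55  # mid-green light, in micrometres

def limiting_f_number(pixel_pitch_um, wavelength_um=WAVELENGTH_UM):
    """F-number at which the Airy disk diameter spans roughly two pixels."""
    return 2 * pixel_pitch_um / (2.44 * wavelength_um)

print(f"21 MP full frame (~6.4 um pixels): ~f/{limiting_f_number(6.4):.1f}")
print(f"24 MP APS-C (~3.9 um pixels):      ~f/{limiting_f_number(3.9):.1f}")
print(f"16 MP m4/3 (~3.75 um pixels):      ~f/{limiting_f_number(3.75):.1f}")
```

The APS-C and m4/3 figures it spits out (around f/5.6) also line up with the earlier posts in this thread.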
 
Nice, but absolute nonsense.
Diffraction and Airy disks are not caused by lenses.

You are confusing DIFFRACTION with REFRACTION, which occurs when light passes from one medium to another.
Two very different phenomena.
I'm really not. This is definitely one area where I know more than you.
 
I'm not, I'm describing Fourier optics through self-diffraction at apertures. The lens defines the wavefront as curved, creating the focus. The diffraction then occurs through the self-interference of the wavefront as it propagates to the focus. This image-formation theory explains why the rays from the periphery increase the resolution of the system. The edges are not, repeat, not the source of diffraction. The spatial extent of the wavefront is. This is not me making stuff up; this is part of how I design, build and operate adaptive-optics-compensated super-resolution microscopes. Go back and read my earlier posts, you might learn something.
 