Front or back focusing issue

I was asked a question today: if you had a lens with front or back focusing issues, would it matter for landscape photography? I said no, it wouldn't, because it's not like portrait photography, where you're generally shooting a narrow depth of field and your focus point (e.g. the eye) has to be spot on. With landscape, depending on your focusing method (e.g. a third into the frame, or the hyperfocal technique) and an aperture of say f/11, it wouldn't matter if the lens focused slightly in front of or behind your chosen focus point, as the depth of field is much greater. Someone else then chipped in to say I was wrong. I don't think I am, but wanted to ask for my own sanity.
 
I think it depends on your technique.

Focusing a third into the frame is not a good approach for landscapes. It's a reasonable rule of thumb for photos at 'normal' distances. But if the mountains are 3 miles away, do you really want to be focusing on a subject that's 1 mile away? I suggest not.

Hyperfocal focusing is often recommended for landscapes. But then if your lens front focuses by any amount whatsoever, the depth of field won't extend to infinity. And that's not dependent on the aperture; it will be just as true at f/11 as it is at f/2.8. (What changes is the hyperfocal distance, not what happens if you focus inside it.)

I think the safest technique is to know what your hyperfocal distance is and then focus a little way beyond that. At f/11, the depth of field should then cover up any errors due to front focusing.
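
To put rough numbers on that, here's a minimal sketch assuming the standard thin-lens hyperfocal and far-limit formulas; the 24mm focal length, f/11 and 0.03mm circle of confusion are illustrative choices, not anything from this thread:

```python
# Minimal hyperfocal sketch, assuming the standard thin-lens formulas.
# The 24 mm focal length, f/11 and 0.03 mm circle of confusion (full
# frame) are illustrative assumptions, not measurements.

def hyperfocal(f_mm, N, coc_mm=0.03):
    """Hyperfocal distance in mm: H = f^2 / (N * c) + f."""
    return f_mm ** 2 / (N * coc_mm) + f_mm

def far_limit(s_mm, f_mm, N, coc_mm=0.03):
    """Far limit of acceptable sharpness (mm) when focused at s_mm."""
    H = hyperfocal(f_mm, N, coc_mm)
    if s_mm >= H:                      # at or beyond H: DOF reaches infinity
        return float("inf")
    return H * s_mm / (H - (s_mm - f_mm))

f, N = 24.0, 11.0
H = hyperfocal(f, N)
print(f"hyperfocal distance: {H / 1000:.2f} m at f/{N:g}")

for label, s in [("focused exactly at H", H),
                 ("5% front focus", 0.95 * H),
                 ("focused 20% beyond H", 1.20 * H)]:
    far = far_limit(s, f, N)
    far_txt = "infinity" if far == float("inf") else f"{far / 1000:.1f} m"
    print(f"{label:>22}: far limit = {far_txt}")
```

Front focusing from H by even 5% pulls the far limit in from infinity to a few tens of metres, while focusing beyond H keeps it at infinity. That's the safety margin being suggested.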
 
I always manual focus; this was about his thoughts. But I still say that with landscape photography it wouldn't matter like it would in portrait photography, where you go to focus on an eye and the lens focuses on, say, her eyebrow at f/2.8, leaving the eye out of focus. A landscape shot at f/11 is a different thing.
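
To put rough numbers on that difference, here's a quick sketch using the standard thin-lens DOF approximations; the 85mm/1.5m portrait and 24mm/3m landscape figures are illustrative assumptions:

```python
# Rough DOF comparison for the two cases above, using thin-lens
# approximations and a 0.03 mm circle of confusion. The focal lengths
# and subject distances are illustrative assumptions.

def dof_limits(s_mm, f_mm, N, coc_mm=0.03):
    """(near, far) limits of acceptable sharpness in mm; far may be inf."""
    H = f_mm ** 2 / (N * coc_mm) + f_mm   # hyperfocal distance
    near = H * s_mm / (H + (s_mm - f_mm))
    far = float("inf") if s_mm >= H else H * s_mm / (H - (s_mm - f_mm))
    return near, far

# Portrait: 85 mm at f/2.8, eye 1.5 m away.
near, far = dof_limits(1500, 85, 2.8)
print(f"portrait : total DOF ~ {(far - near) / 10:.0f} cm")

# Landscape: 24 mm at f/11, focused 3 m into the scene.
near, far = dof_limits(3000, 24, 11)
far_txt = "infinity" if far == float("inf") else f"{far / 1000:.1f} m"
print(f"landscape: sharp from {near / 1000:.2f} m to {far_txt}")
```

A centimetre or two of misfocus eats a large slice of the portrait's roughly 5cm of depth of field, but barely moves the landscape's near limit.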
 
I think it only happens with autofocus. If you manually focus on a certain spot then the focus will be correct.

Is this only if you use live view, or if you ignore the focus confirmation indicators in the viewfinder and focus on the screen?
 
Yes indeed. Back/front focus is only your autofocus being badly calibrated. In manual you make the decision yourself.
That's not the case. If the focusing screen is not accurately calibrated to the sensor then manual focusing will also give a front/back focus error when using the viewfinder.
 
I think the safest technique is to know what your hyperfocal distance is and then focus a little way beyond that. At f/11, the depth of field should then cover up any errors due to front focusing.
IMO that's the only way of using HFD these days... you're guessing at the distance anyway.
 
I think I've learnt something new today. So if a lens back or front focuses, using manual focus what you see is what you get?
Yes.
Front/back focus is caused by the PDAF module, which is a completely different optical system (it uses parts of the main lens, but in a different way and with its own additional optics). If you use the viewfinder (or CDAF) you are eliminating the PDAF system from the equation. However, viewfinder focus typically isn't as accurate, due to the very small image, because you cannot see the DOF for a lens wider than ~f/2, and because the focus confirmation is still using the PDAF system.
 
I said no, it wouldn't, because it's not like portrait photography, where you're generally shooting a narrow depth of field and your focus point (e.g. the eye) has to be spot on.
I would tend to agree that a minor misfocus would not matter for many, if not most, landscape-type shots.
Technique-wise, I would suggest it is best to focus on whatever you feel the main subject is, unless there is something very near that you also need acceptably sharp (then use HFD).
 
That's not the case. If the focusing screen is not accurately calibrated to the sensor then manual focusing will also give a front/back focus error when using the viewfinder.
That's a separate issue... a misaligned focus screen will not cause the AF to front/back focus; it only affects the viewfinder/manual focus (and metering).
 
That's a separate issue... a misaligned focus screen will not cause the AF to front/back focus; it only affects the viewfinder/manual focus (and metering).

Correct, Steven, but the post I quoted stated differently.

Bob
 
That's not the case. If the focusing screen is not accurately calibrated to the sensor then manual focusing will also give a front/back focus error when using the viewfinder.

I guess you're right. That's not something we hear of much with DSLRs nowadays, but I guess it is still a possible problem. Surely it was a big one with film cameras.
 
I guess you're right. That's not something we hear of much with DSLRs nowadays, but I guess it is still a possible problem. Surely it was a big one with film cameras.
You used to be able to get shims for the focusing screen, and a ground glass sheet to insert in place of the film. More recently, Canon would provide shims for the 5D and 5D MkII, but they stopped when the screens became less user-friendly to change. The 1-series bodies are manufactured to a tighter tolerance and shims were never an option (according to my CPS rep when I requested some).

Bob
 
So a mirrorless camera would cure the focus issues ;)
Yes, because by definition a mirrorless camera uses the main imaging sensor to autofocus.
But a conventional DSLR does exactly the same in Live View mode, so you don't need to switch cameras to achieve this.
 
Please expand, Steven, in as simple terms as possible for this dunce :)
I'll try ;).

Phase detection works by comparing the position (phase) of separate virtual images created from different portions of the light... basically, if you think of "the V" of light in a typical diagram, where the base of the V is the light being in focus, there is a complete/separate (virtual) image in both legs of the V.
For on-sensor PDAF to function, a certain number of pixels have to be dedicated to sensing/comparing these separate images... this is a dedicated data stream/function, so those pixels are lost from the image (re-mapped the way hot pixels are mapped out). The more AF points there are, the more is lost from the image stream (but even 1,000 pixels lost is relatively minor in a 40MP system). The more significant issue is that because the sensor is using (essentially) two images composed from all of the light, at the point where they (should) converge there's not much movement to detect (phase shift). Additionally, because it is using all of the light (or half each), the image sharpness and DOF of the lens/aperture will affect the accuracy of on-sensor PDAF, as will the amount of light transmitted by the lens.

The PDAF module uses a bunch of lenses and sensors to further isolate multiple images, to compare with greater accuracy/speed. Because of how the system works it is not dependent on the lens aperture to control the amount of light (only that the aperture is not smaller than a particular sensor point is designed for), and it is not dependent on the lens aperture for DOF/image sharpness. The module's lens system has its own aperture/DOF, which is somewhere around f/7 in terms of light transmission and over f/22 in terms of DOF (aperture numbers are relative to the main lens FL/aperture function). And because the virtual images are taken from smaller portions of the total light, they have greater potential offsets/phase shift, for greater accuracy. I.e. an f/2.8 AF sensor point uses images from the f/2.8 area of the objective lens, which have greater separation and accuracy than the images used by an f/8 AF sensor point.
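
As a toy illustration of that last point (my numbers, not any real camera's geometry), the shift a PDAF point measures grows with the separation, on the objective, of the two sampled light bundles; the "baselines" below are simply the entrance-pupil diameters of a hypothetical 50mm lens at each aperture:

```python
# Toy similar-triangles sketch: the phase shift a PDAF point measures is
# roughly (baseline on the objective) x (focus error) / (focus distance).
# Baselines here are entrance-pupil diameters for a hypothetical 50 mm
# lens (50 / f-number); all numbers are illustrative, not from any camera.

def phase_shift_um(baseline_mm, defocus_mm, image_dist_mm=50.0):
    """Approximate separation of the two virtual images, in microns."""
    return baseline_mm * defocus_mm / image_dist_mm * 1000

defocus = 0.05  # mm of focus error at the sensor plane
for label, baseline in [("f/8 point  (6.3 mm baseline)", 50 / 8),
                        ("f/5.6 point (8.9 mm baseline)", 50 / 5.6),
                        ("f/2.8 point (17.9 mm baseline)", 50 / 2.8)]:
    print(f"{label}: shift = {phase_shift_um(baseline, defocus):.1f} um")
```

Roughly three times the shift for the same focus error is why the f/2.8 pair can resolve much smaller errors.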
 
I'll try ;).

Phase detection works by comparing the position (phase) of separate virtual images created from different portions of the light...
Well, thanks for that, I appreciate your attempt to make it easy to understand. I'll reread it a dozen or so times and maybe it will make sense to me lol.
So I will stick with my D3300 :rolleyes:
 
I drew up a few diagrams... they are an oversimplification of the PDAF system, but they convey the main concepts simply.

The first diagram shows how two images shift position on an AF sensor point/grouping with changes in focus. This position/phase shift is how the camera knows if an image is in focus or front/back focused, and therefore which way to drive.

The second diagram shows how the angular offset is greater for AF points designed for faster apertures. It also shows the FOV/virtual apertures of the PDAF module's various lenses (which determine the amount of light received by the AF sensor). Because the angle is greater for the wider source locations, there is more shift in position with focus changes, and therefore greater accuracy.
Note that in this drawing, if the lens is f/2.8 or faster, the PDAF system is using all three pairs for AF. At f/5.6 it will be using two pairs, and at f/8 one pair. In some systems the f/5.6 pair will be for one line direction (i.e. horizontal) and the f/8 pair will be for the other direction (vertical)... and when they are both in use the focus points are "cross types". But if the wider pair is not available, the focus point reverts to a line type (the specifics vary with the individual AF system's design).
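
If it helps, that pair-selection logic boils down to something like this sketch (a hypothetical module with f/2.8, f/5.6 and f/8 pairs; real systems differ in the details):

```python
# Sketch of the pair-selection logic for a hypothetical PDAF point with
# f/2.8, f/5.6 and f/8 sensor pairs. Real AF systems vary in the details.

def active_pairs(lens_max_aperture):
    """Which pairs can be used, and whether the point works as a cross type."""
    pairs = [(2.8, "f/2.8 pair"), (5.6, "f/5.6 pair"), (8.0, "f/8 pair")]
    usable = [name for limit, name in pairs if lens_max_aperture <= limit]
    kind = "cross type" if len(usable) >= 2 else "line type"
    return usable, kind

for aperture in (2.8, 5.6, 8.0):
    usable, kind = active_pairs(aperture)
    print(f"f/{aperture:g} lens: {', '.join(usable)} -> {kind}")
```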

The third diagram shows the basics of on sensor PDAF (which is also how split prism viewfinder screens work).

[Attached image: the three PDAF diagrams described above]

It helps if you understand that every point on the objective element contains a complete image as seen from that location. The size of a lens's objective element only determines the maximum amount of light transmitted (max aperture), not the FOV; i.e. all lenses of the same FL transmit the same image regardless of the size of the objective element. A larger objective element (a faster aperture) increases exposure by combining more images onto the sensor. At max aperture the added images come from the edges of the lens and may show some of the lens barrel, which causes vignetting.

An image made from only a portion of the total is called a virtual image, and an image made up from all of the light is called an aerial image. What we normally think of as "focus" is maximum contrast in the aerial image (CDAF) as that is how our eyes work.
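
For the CDAF comparison at the end: contrast detection is essentially a hill climb on a sharpness score, something like this toy loop (the contrast curve is synthetic, not anything from a real camera):

```python
# Toy contrast-detection AF: nudge the focus position, keep going while the
# contrast score improves, reverse and halve the step when it drops.
# The contrast curve is synthetic; real cameras score sensor image detail.

def contrast(pos, true_focus=0.37):
    """Stand-in sharpness metric, peaking at the true focus position."""
    return 1.0 / (1.0 + 50 * (pos - true_focus) ** 2)

def cdaf(pos=0.0, step=0.08):
    while abs(step) > 1e-4:
        if contrast(pos + step) > contrast(pos):
            pos += step          # improving: keep stepping this way
        else:
            step *= -0.5         # overshot: reverse direction, smaller step
    return pos

print(f"converged at {cdaf():.3f} (true focus at 0.370)")
```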
 