This is from Andy Westlake, in response to forum chat about the ISO article in the Jan 2018 edition of AP:
Clearly there's still a lot of misunderstanding here.
The technical definition of 'exposure' is purely about how much light reaches the sensor, which means that it's a function of shutter speed and aperture alone. Open the aperture wider, or extend the shutter speed, and the exposure is increased. Keep the shutter open for double the time and halve the aperture area, and the exposure stays the same, so 1/30sec at f/5.6 gives the same exposure as 1/60sec at f/4. This is true regardless of the ISO you choose to set on your camera, be it ISO 100 or ISO 100,000. It simply doesn't matter if you prefer to think differently - this is the technically correct definition.
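To put numbers on that, here's a quick Python sketch using the standard exposure-value formula EV = log2(N²/t), where N is the f-number and t the shutter time in seconds. Equal EV means equal exposure:

```python
import math

def exposure_value(f_number, shutter_s):
    # EV = log2(N^2 / t); equal EV means the same amount of light per unit area
    return math.log2(f_number ** 2 / shutter_s)

# Marked f-numbers are rounded: the exact one-stop series runs in powers of
# sqrt(2), so 'f/5.6' is really 4 * sqrt(2) ~= 5.657.
print(exposure_value(4 * math.sqrt(2), 1 / 30))  # ~9.91
print(exposure_value(4.0, 1 / 60))               # ~9.91 -- identical exposure
```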
The ISO setting then defines how bright the image will be for any given exposure. The higher the ISO, the brighter the image, so 1/60sec at f/4 and ISO 200 generates a brighter image than 1/60sec at f/4 and ISO 100, despite both having the same exposure. However, in-camera ISO settings are only meaningful in terms of the camera's JPEG processing. Raw files don't have any inherent ISO rating of their own, because they're not visually meaningful images and require demosaicing, white balancing and gamma correction before they become actual photographs. Raw files can quite happily be developed to a wide range of different brightnesses, and therefore ISOs, especially given the huge dynamic range of modern digital sensors.
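One way to see the "raw files have no inherent brightness" point is to treat development as nothing more than a brightness multiplier. This is only a toy sketch (made-up sensor values, plain gamma 2.2 standing in for a real tone curve), but it shows the same raw data rendering happily at different ISOs:

```python
import numpy as np

# Hypothetical linear sensor values from one fixed exposure
raw = np.array([0.02, 0.09, 0.18, 0.45])

def develop(raw_linear, iso, base_iso=100, gamma=2.2):
    # 'ISO' here is just a brightness multiplier applied during development
    scaled = np.clip(raw_linear * (iso / base_iso), 0.0, 1.0)
    return (scaled ** (1 / gamma) * 255).round().astype(int)

print(develop(raw, 100))  # ~[43 85 117 177] -- darker rendering
print(develop(raw, 400))  # ~[81 160 220 255] -- two stops brighter, same raw file
```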
Another myth is that ISO is intrinsically linked to in-camera electronic amplification of the signal from the sensor to increase its 'sensitivity'. In reality, the camera manufacturer's image processing can use any combination of hardware gain and mathematical manipulation to achieve the desired JPEG image brightness. Many in-camera dynamic-range expansion systems work by changing this balance, as excessive gain can irretrievably clip highlights.
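A toy illustration of why that balance matters (made-up numbers; "analog gain" here just means multiplication before the signal is clipped at sensor full scale, "digital" means multiplication afterwards):

```python
import numpy as np

scene = np.array([0.10, 0.40, 0.70])  # hypothetical linear values; full scale = 1.0

# Option A: 4x gain applied before recording -- two values clip in the file
# itself, and that highlight detail is gone for good.
recorded_a = np.clip(scene * 4, 0.0, 1.0)        # [0.4, 1.0, 1.0]

# Option B: record at unity gain and brighten 4x in processing -- the file
# still holds 0.40 and 0.70, so a converter can roll those highlights off
# gracefully instead of losing them.
recorded_b = scene.copy()                        # [0.1, 0.4, 0.7] -- nothing clipped
rendered_b = np.clip(recorded_b * 4, 0.0, 1.0)   # same final brightness as option A

print(recorded_a, recorded_b, rendered_b)
```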
Because ISO is so difficult to pin down, the current standard (ISO 12232:2006) contains no fewer than five different definitions of ISO, which can potentially all give different answers. The most widely used is probably 'Standard Output Sensitivity', which in effect states that an 18% grey card should be rendered as a mid-grey in the camera's JPEG output - no more, no less. But once you turn on adaptive dynamic-range balancing systems like Canon's Auto Lighting Optimiser, Nikon's Active D-Lighting or Sony's Dynamic Range Optimiser, even this definition stops working and you're left with only one choice, 'Recommended Exposure Index', which broadly translates as 'use this ISO and your images will look right'.
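As a rough sanity check of the Standard Output Sensitivity idea (going from memory, the standard's SOS target is an 8-bit sRGB output level of 118), pushing 18% grey through a plain gamma-2.2 curve, ignoring the sRGB linear toe, lands almost exactly there:

```python
# 18% grey through an approximate sRGB tone curve (plain gamma 2.2)
grey_linear = 0.18
jpeg_level = round(grey_linear ** (1 / 2.2) * 255)
print(jpeg_level)  # 117 -- right next to the ~118 mid-grey target
```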
Increased noise at high ISO settings doesn't come as a result of extra processing of the signal by the camera. Instead, it simply reflects the fact that to get an image of a standard brightness, you use a lower exposure at high ISO. So if you're shooting at ISO 1600, you'll have used 16 times less light to make your image than at ISO 100. But the less light you use, the more prominent the intrinsic 'shot noise' within the light itself becomes relative to the signal: this simply reflects the quantum, or particulate, nature of light. (Some 'read noise' is also introduced by the electronics along the way, but on modern cameras it's very low indeed.)
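The relationship is easy to demonstrate: photon arrivals follow Poisson statistics, so a pixel that collects N photons on average has noise sqrt(N) and hence a signal-to-noise ratio of sqrt(N). A quick simulation (the photon counts are purely illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)

def snr(mean_photons, trials=100_000):
    # Poisson arrivals: std = sqrt(mean), so SNR = mean / std = sqrt(mean)
    counts = rng.poisson(mean_photons, trials)
    return counts.mean() / counts.std()

print(snr(1600))  # ~40 -- the 'ISO 100' pixel, 16x more light
print(snr(100))   # ~10 -- the 'ISO 1600' pixel: 16x less light, 4x worse SNR
```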
Incidentally, one logical consequence of all this is that the 'Exposure' slider in Adobe Camera Raw and Lightroom is incorrectly labelled - it should really be 'ISO', as on some other raw converters.
Made me rethink things a bit deeper