
Dynamic range of current Canon sensors


kanellopoulos


Ocean Physics is wrong. Some writers are now using latitude in this way, but the OED gives the meaning of latitude in the photographic sense as:

 

"The range of exposures for which an emulsion, printing paper, etc., will give acceptable contrast; spec. the ratio (or its logarithm) of the exposures between which the characteristic curve is straight."

 

This seems to agree with Sheppard and Mees, though they use "overexposure region" and "underexposure region" (which they admit must be defined arbitrarily, due to the toe and shoulder of film).

 

The problem seems to be a change in the common meaning of latitude. It was originally used simply as a synonym for range, an older meaning being "full extent".


If you are planning on shooting landscapes on a tripod, I would try out the Photoshop HDR feature before buying grad filters. I used it to shoot a conference room where the client wanted to show off the outside view. With the interior properly exposed, the outside view was completely blown out; when exposing for the outside, the inside was pretty much black.

 

I bracketed 8 stops, put the files into Photoshop, and it did the rest. The only drawback is that you need multiple exposures.
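Photoshop isn't the only way to do the merge, either. For anyone who prefers to script it, here is a minimal sketch using OpenCV's Mertens exposure fusion, which blends a bracketed series directly (the file names are made up, and this is not what Photoshop does internally):

import cv2
import numpy as np

# hypothetical bracketed frames, darkest to lightest
files = ["bracket_-2.jpg", "bracket_0.jpg", "bracket_+2.jpg"]
images = [cv2.imread(f) for f in files]

merger = cv2.createMergeMertens()       # exposure fusion, no tone-mapping step
fused = merger.process(images)          # float32 output, roughly 0..1
out = np.clip(fused * 255, 0, 255).astype("uint8")
cv2.imwrite("fused.jpg", out)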


"Which kind of nullifies the utility of the method if there is any movement in the scene, like wave or ripple action on water, or vegetation moving in a wind."

 

It doesn't nullify it at all. It all depends upon how you paint your layer mask that blends the two images. You can take the entire surface of the water from one exposure and the entire sky from another. Both clouds and water could have moved significant amounts from frame to frame and the effects of that movement would not show up at all in the final product.
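For anyone who wants to see the mechanics outside Photoshop, here is a rough numpy sketch of the same layer-mask idea. The file names are made up, and the top-half mask is a synthetic stand-in for a mask you would actually paint by hand:

import cv2
import numpy as np

dark = cv2.imread("exposed_for_sky.jpg").astype(np.float32)     # hypothetical
light = cv2.imread("exposed_for_water.jpg").astype(np.float32)  # hypothetical

mask = np.zeros(dark.shape[:2], np.float32)
mask[: mask.shape[0] // 2, :] = 1.0        # pretend the sky fills the top half
mask = cv2.GaussianBlur(mask, (0, 0), 25)  # feather the edge, like a soft brush
mask = mask[..., None]                     # broadcast across the color channels

blended = mask * dark + (1.0 - mask) * light
cv2.imwrite("blended.jpg", blended.astype(np.uint8))

Because each region comes entirely from one frame, movement between frames never shows in the blended region itself.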


Mark, all I can tell you is that blending exposures is a technique that works for a lot of people, myself included. I will be glad to provide you with examples if you want.

 

Is blending applicable to every conceivable situation? No, there are limitations. But I have found that the number of times where it is possible exceeds the times where it isn't. And as far as being difficult, you can blend two images in less than 30 seconds. I'll grant you that getting the best results takes longer, but the point is, it's not hard to do.

 

What you really have to be careful of is camera movement between exposures. A halfway decent tripod, mirror lockup, and a camera with an auto-bracketing feature make that child's play to achieve.


One more comment and then I am going to cease and desist.

 

A week ago I took a series of shots in Av mode. I had exposure compensation set to underexpose 2/3 of a stop because I was leery of blowing highlights in a couple of areas. Every single shot in that series was shot at 1/80 second at f/5.6. Every shot except for one, that is. For whatever reason the camera decided to underexpose that shot by an additional two stops and used a shutter speed of 1/320 sec.

 

According to Andrew's theory the underexposure would not have mattered. But in the real world that extra two stops of underexposure rendered the photo useless. Sure, I could (and I did) adjust the exposure in a RAW converter. And I did end up with a recognizable image. But the noise was awful and the color was worse. It was a photo of such poor overall quality that no amount of post-processing could save it.

 

I will concede that you can recover data over an 8 to 9 stop range. But that is a very different thing from having 8 or 9 stops of USABLE range. And you might as well not have it if you can't use it. If a shot is underexposed by two or three stops, I can't use it. It's a waste of time trying. You can say what you will, but I verified that for myself a very long time ago. If you are happy with the results you get trying to recover poorly exposed photos, all I can say is good luck and knock yourself out.


I just ran a quick test of the dynamic range of my 5D. I set up a scene (with white cards, black textured camera case, lamps, etc.) that showed a difference of 7.7 stops between the lightest part of the white card (but not a specular reflection) and the darkest part of the camera case (which was well shaded from the light source). With some exposure compensation, I was able to find an exposure that only clipped the very brightest parts of the highlights and the very darkest parts of the shadows. The texture in the dark shadows was almost impossible to see without adjusting the curves, and even then, it wasn't really robust or natural-looking. When I adjusted the exposure compensation in either direction, one end or the other of the histogram was clipped off (blown highlights or black shadows).

 

Given that my spot meter readings may not have been completely accurate in measuring the dynamic range of the scene, it seems reasonable that the sensor might be capturing 8 to 9 stops.
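Checking the histogram ends numerically is less error-prone than eyeballing them. A quick sketch, assuming an 8-bit rendering of the test shot (the file name is made up):

import cv2
import numpy as np

img = cv2.imread("test_scene.jpg", cv2.IMREAD_GRAYSCALE)
print("clipped highlights: %.2f%%" % (np.mean(img >= 255) * 100))
print("blocked shadows:    %.2f%%" % (np.mean(img <= 0) * 100))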


"Mike, I would recommend experimentation as you don't seem to believe me. With RAW conversion, I have been able to 'save' images that were underexposed up to 8 stops, and retain detail if not tonality. Go ahead, try it. Put your camera in M mode, meter something, and take a picture. Then take another one that is 2 stops under. Do it again. Then do it a fourth time. If you're using a 10D like I am, you may even be able to go deeper than 8 stops under medium exposure."

To be fair, I went back and followed the procedure Andrew described with the scene I had set up. The results:

At 2 stops under: Adjusting with the RAW converter, I was able to get a scene that looked fairly "normal" except the shadows had less detail (lower values had no detail) and more noise.

At 4 stops under: After adjustment, the "whites" were now very light grey, "midtones" (what had been midtones in the proper exposure) were dark and muddy, lower tones had no detail except for noise.

At 6 stops under: After adjustment, the "whites" were now about middle grey, "midtones" were very dark and with little or no detail, lower tones had no detail except for lots of noise.

At 8 stops under: After adjustment, the "highlights" were now the only thing showing any detail, "midtones" and lower were nothing but murky noise.
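The pattern in those results is easy to reproduce numerically. A sketch, assuming an idealized, noiseless 12-bit linear sensor (real raw files are worse, since read noise swamps the lowest values):

import numpy as np

scene = np.linspace(0.0, 1.0, 4096)      # ideal scene tones, 0 = black, 1 = white
for stops_under in (2, 4, 6, 8):
    captured = scene / 2 ** stops_under  # underexpose by N stops
    raw = np.round(captured * 4095)      # quantize to 12-bit codes
    print(f"{stops_under} stops under: {len(np.unique(raw))} distinct levels survive")

At 8 stops under, only about 17 codes remain to describe the whole scene, which is why everything below the highlights turns to murk.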


I come from an electronics background dealing with the design of A/D and D/A converters.

 

I may be missing the question here but the way I see this, it is really simple.

 

The digital sensor and associated electronics have a certain dynamic range. You want that dynamic range to capture the areas of interest in your composed shot. If you set the exposure too far positive or negative, you run the risk of not only getting the wrong exposure but also of pushing things into positive clipping (overexposure) or negative clipping (underexposure).

 

When clipping occurs, the information that goes beyond the clipping level is lost forever. You can't recover what was never recorded.

 

The sensor has a finite dynamic range. It is simple really.
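A tiny demonstration of that point (just numpy arithmetic, nothing camera-specific):

import numpy as np

signal = np.array([0.25, 0.5, 1.0, 2.0, 4.0])  # last two exceed full scale (1.0)
recorded = np.clip(signal, 0.0, 1.0)
print(recorded)  # [0.25 0.5  1.   1.   1.  ] -- 2.0 and 4.0 map to the same code

No amount of gain applied afterward can separate the clipped values again.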


I certainly don't think you can salvage a shot that is 8 stops underexposed. In a 12-bit file, the bottom 4 stops contain only 16 levels of information. That is certainly not enough to give any kind of quality, and that is ignoring the fact that a lot of the response will be noise.
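The arithmetic, for anyone who wants to check it, for a 12-bit linear capture where each stop down halves the available codes:

for stop in range(1, 13):
    top = 4096 >> (stop - 1)   # codes at the top of this stop
    bottom = 4096 >> stop      # codes at the bottom of this stop
    print(f"stop {stop} from the top: {top - bottom} levels")
# the bottom 4 stops together get only codes 0..15 -- 16 levels in all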

Alistair has come the closest to hitting the nail on the head.

 

It may be true that there are '12 bits' of image data in the conversion, and the sensor amplitude may even reasonably track a 12-stop range of output. I'd even grant that it may be roughly linear and noise-free (but that's a whole other can of worms).

 

So, let's assume that we have captured a scene with only the bottom bit of data -- essentially 11 stops underexposed (this is a stretch, because most A/D converters are only accurate to plus/minus 1 1/2 counts in the least significant bit). At that point, there are only two states of output -- 0 or 1. That might be fine for line art, or a copy machine, but it's pretty much a waste for photographic purposes.

 

So, how many bits of data is enough to judge a scene to be 'pretty good', or at least recognizable? This is somewhat subjective, but something around 64 to 128 levels is judged 'good enough' by folks in various tests, when shown pictures confined to only 64 or 128 luminance levels. It depends a lot on the viewer, and the content of the photo, of course. 64 is a magic number in the machine vision industry, by the way, for software algorithms to do reasonable amounts of edge-finding, etc.

 

So, given the benefit of the doubt that someone might think a 64-level photo is OK, we've just used up 6 bits of our 12-bit output just to get a usable image.

 

That leaves, um, let me see now -- about 6 bits of so-called latitude, or 6 stops.

 

If 64-level photos offend you (they certainly offend me), then you might wish to err on the safe side, and reserve at least 7 or 8 bits (128 or 256 levels), minimally, to represent your image. That leaves then about 5 or 4 bits for exposure latitude, or 5 or 4 stops, respectively.
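The same arithmetic as a sketch, if you want to plug in your own quality threshold (the 64/128/256 thresholds are the ones discussed above):

import math

bits_total = 12
for min_levels in (64, 128, 256):
    latitude = bits_total - math.log2(min_levels)
    print(f"{min_levels} levels required -> about {latitude:.0f} stops of latitude")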

 

In case you're wondering, I spent 10 years in a previous life writing machine vision software -- I can think about this stuff in my sleep. I wish we'd had 12-bit converters in those days. Instead, we used 10-bit converters (internally) that then produced 8-bit output, accurate to plus/minus 1/2 bit. To do good machine vision, the exposure had to be spot on, or there wasn't enough information in the image to do good vision processing. "Good" in our case meant repeatability to 1/10 pixel.

 

I suspect that in the Canon cameras they are actually using a 14-bit converter to produce their 12-bit output. Otherwise there wouldn't really be 12 bits of usable data coming out of the converter. Even then, dark noise and other sensor phenomena cause some amount of noise that must be dealt with creatively to get any kind of good latitude.

 

Using A/D converters with more bits and throwing away the least significant bits is like scientific calculators, which have 'guard digits' that are never displayed, to lessen round-off errors in their calculations. It's a common industry strategy.
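A toy simulation of the guard-bit strategy, assuming an idealized converter whose only flaw is plus/minus 1.5 counts of error in its own LSB (the noise model is made up purely for illustration):

import numpy as np

rng = np.random.default_rng(0)
signal = rng.random(100_000)  # ideal input values, 0..1

def convert(x, bits, err_counts=1.5):
    # quantize with +/- err_counts of converter error, in that converter's LSB
    full = 2 ** bits - 1
    noise = rng.uniform(-err_counts, err_counts, x.size)
    return np.clip(np.round(x * full + noise), 0, full)

direct12 = convert(signal, 12) / 4095
via14 = np.clip(np.round(convert(signal, 14) / 4), 0, 4095) / 4095  # drop 2 guard bits

for name, out in (("direct 12-bit", direct12), ("14-bit via guard bits", via14)):
    err = (out - signal) * 4095  # error in 12-bit counts
    print(name, "RMS error: %.2f counts" % np.sqrt(np.mean(err ** 2)))

The 14-bit path wins because its hardware error shrinks to a quarter of a 12-bit count once the guard bits are rounded away.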


  • 2 weeks later...

May I rephrase the question? What is the textural range of Canon sensors? How many f/stops does it take to get highlights and shadows with detail recoverable from a RAW file?

 

B&W film has a textural range of more than 13 f/stops. Ansel Adams' photo "Mrs. Gunn on Porch" had (according to him) values that fell on Zone 13, which he managed to squeeze out of the negative using water-bath development. I think color negative has even more, because I don't ever remember blown-out highlights on a color negative film.

 

I did the test from Ansel Adams' book "The Negative". You basically shoot a uniformly lit, flat, textured subject (a white towel in my case) based on a spot meter reading (this is your grey Zone V). Then increase exposure by opening the aperture by one stop or decreasing the shutter speed by one stop: 5 exposures over and 5 under (11 f/stops in all). All in RAW+JPEG at ISO 100. Color space was Adobe RGB. Raw conversion was in RawShooter Premium and Adobe Camera Raw (which did a better job on the shadows).
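For anyone repeating the test, the bracket sequence is easy to tabulate. A sketch varying only shutter speed at a fixed aperture, around a made-up metered reading of 1/125 s:

metered = 1 / 125             # hypothetical Zone V meter reading
for stops in range(-5, 6):
    t = metered * 2 ** stops  # -5 stops = Zone 0, +5 stops = Zone 10
    label = f"1/{round(1 / t)} s" if t < 1 else f"{t:g} s"
    print(f"zone {stops + 5:>2}: {label}")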

 

Oh, and the camera is a Canon 5D.

 

I got the following results:

 

- Zones 10 and above (255 white in the JPEG): hopeless, fully washed out. Using linear raw conversion will put very little detail back into Zone 10. Not practical.

- Zone 9: washed out in the JPEG, but in RAW you can bring detail back easily.

- Zone 1 (values below 25 in the JPEG) can be opened up in the raw converter by increasing exposure (Adobe Camera Raw is much better in this regard than RawShooter) and brought up to Zones 2 and 3 with clean results.

- Zone 0 (total black in the JPEG file) is not really 0 in RAW. You can open up the detail in this zone to Zone 2, but it is noisy.

 

Conclusion:

- Textural range of the 5D at ISO 100: 10 stops, with the bottom stop needing some noise removal.

- One strange thing: the spot meter on my 5D reads one full stop lower than a handheld meter.

- Unlike the days of film, when I followed the common wisdom of "expose for the shadows and develop for the highlights", with digital it's "expose for the highlights and open up the shadows".

- When doing street photography on sunny days I always set my 5D 1 EV lower than the meter indicates, to keep it from blowing out the highlights on people's faces. The Canon metering system could use some improvements.

 

 

Can someone else repeat this test, please? I would like to find out more about this subject.

