
Sensor color capability



Not sure what terminology to use, but what I want to know is the limits of the color space when you convert a RAW image. You can assign it any profile you like, but how much of the available color space, say ProPhoto, can a given sensor actually make use of? I'm assuming it has to do with how good the Bayer filter is, but I have no idea what else might matter. I can't find any references or tests of different cameras. Or do I not understand the whole concept?

Basically, I don't know the answer, Conrad, and I suspect almost nobody else does either, because the commonly used 2D CIE 'horseshoe' model of colour spaces (dating from 1931!) is fundamentally incapable of showing that information.

 

Theoretically, an RGB space that encompasses said horseshoe would need impossible, 'imaginary' primaries that fall outside its boundary, more saturated than any monochromatic light. That's if trichromatic theory is to be swallowed wholesale.
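
To put rough numbers on that, here's a small Python sketch using nothing but the published CIE xy chromaticities of the sRGB and ProPhoto primaries. ProPhoto's green and blue primaries sit outside the spectral locus, i.e. they're 'imaginary' colours that no real light can produce, which is exactly how the triangle gets made big enough. (Triangle area in the xy diagram isn't perceptually uniform, so treat the ratio as indicative only.)

```python
# Compare the sRGB and ProPhoto RGB gamut triangles in the CIE 1931 xy diagram.
def shoelace_area(pts):
    """Polygon area from [(x, y), ...] via the shoelace formula."""
    area = 0.0
    for i in range(len(pts)):
        x1, y1 = pts[i]
        x2, y2 = pts[(i + 1) % len(pts)]
        area += x1 * y2 - x2 * y1
    return abs(area) / 2.0

# Published primary chromaticities (R, G, B).
srgb     = [(0.6400, 0.3300), (0.3000, 0.6000), (0.1500, 0.0600)]
prophoto = [(0.7347, 0.2653), (0.1596, 0.8404), (0.0366, 0.0001)]
# ProPhoto's green and blue primaries lie outside the spectral locus:
# imaginary "colours" that no physical light source can produce.

a_srgb, a_pp = shoelace_area(srgb), shoelace_area(prophoto)
print(f"sRGB triangle area in xy:     {a_srgb:.4f}")
print(f"ProPhoto triangle area in xy: {a_pp:.4f}")
print(f"ProPhoto / sRGB:              {a_pp / a_srgb:.2f}x")
```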

 

The dye-based filters used on camera sensors obviously can't be too narrowly cut, otherwise they'd lose far too much light to be practical. Plus their centre wavelengths probably don't align exactly with the ProPhoto primaries. So, who knows?

 

I don't think the sensor or camera makers are going to 'fess up to their colour systems being a bit substandard.

 

Just FWIW, here's a collection of daylight spectra taken through a simple slit+prism spectrometer, using a variety of digital cameras.

(The top spectrum is a simulation of what sunlight should look like.)

And below that, the spectra as seen by the digital cameras in my collection.

[Attached image: Camera-renders.jpg]

You'll notice that hardly any two of them match closely, and you'll see that there are gaps a mile wide where monochromatic yellows and cyans can pass right through unnoticed.


That's really interesting! I have some gratings and prisms and have taken casual shots of sunlit spectra, but have never been rigorous about it. I did notice that my Z6 did better in the blue than my antique D200. I've also got a monochromator, so I could probably get comparisons at specific wavelengths, though I don't know whether that's useful. Doing any of this under controlled conditions, so the experiments can be repeated, is tough. I wonder if @digitaldog has any thoughts?
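
If I do try the monochromator route, I'd probably tabulate the per-channel raw response with something like the sketch below. It assumes the rawpy library, a Bayer-pattern sensor, constant exposure per frame, and placeholder filenames of the form mono_550nm.NEF, so it's only a starting point, not a calibrated measurement.

```python
# Sketch: average raw (pre-demosaic) response per CFA channel in a central
# patch, one raw frame per monochromator wavelength.  Filenames, patch size
# and the 400-700 nm sweep are placeholders.
import numpy as np
import rawpy  # LibRaw wrapper, assumed installed

def patch_response(path, patch=200):
    with rawpy.imread(path) as raw:
        img = raw.raw_image_visible.astype(np.float64)
        cfa = raw.raw_colors_visible          # per-pixel CFA index (0=R, 1=G, 2=B, 3=G2)
        black = raw.black_level_per_channel   # per-channel black level
        h, w = img.shape
        ys = slice(h // 2 - patch // 2, h // 2 + patch // 2)
        xs = slice(w // 2 - patch // 2, w // 2 + patch // 2)
        img, cfa = img[ys, xs], cfa[ys, xs]
        means = {}
        for ch, name in enumerate(("R", "G", "B", "G2")):
            vals = img[cfa == ch] - black[ch]
            if vals.size:
                means[name] = float(vals.mean())
        return means

# One exposure-matched frame per wavelength, shot off the monochromator exit slit.
for wl in range(400, 701, 10):
    print(wl, "nm:", patch_response(f"mono_{wl}nm.NEF"))
```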

It neither begins nor ends with the camera sensor. The human side of the viewfinder is very much involved.

 

Color theory is a complex and wondrous thing.

A not-spectacular Wikipedia article does introduce the topic generally:

 

Color theory - Wikipedia

 

where it says

Color theory has not developed an explicit explanation of how specific media [emphasis, JDM] affect color appearance: colors have always been defined in the abstract, and whether the colors were inks or paints, oils or watercolors, transparencies or reflecting prints, computer displays or movie theaters, was not considered especially relevant.

 

There is a marvelous, in my opinion, book that deals with the BIOLOGY of seeing:

 

Margaret Livingstone's Vision and Art: The Biology of Seeing (revised and expanded edition, 2014).

 

Among other things, this has some discussion of color.

 

Although this is much broader than the OP's question, it is fundamental.



I can't get colors to match from the same camera using different Raw converters.


I think the OP's question boils down to this: what is the color response of the various digital camera sensors, and how is it modified in-camera? I don't have an answer, but I'm very confident that the camera manufacturers know exactly what it is and simply aren't sharing it with us. It's the same issue as dynamic range, where we have to depend on third-party test houses. Talk about transparency, or the lack thereof!
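
For what it's worth, the in-camera (and in-converter) part largely boils down to a 3x3 matrix from camera RGB to XYZ, plus white balance, tone and gamut mapping, and every raw converter picks its own characterisation. Here's a toy sketch with made-up placeholder matrices (not any real camera's data) showing why two converters can render visibly different colours from identical raw values:

```python
# Toy illustration: the same camera-RGB triplet pushed through two different
# (made-up) camera->XYZ matrices lands on different sRGB values.
import numpy as np

# Standard XYZ(D65) -> linear sRGB matrix.
XYZ_TO_SRGB = np.array([
    [ 3.2406, -1.5372, -0.4986],
    [-0.9689,  1.8758,  0.0415],
    [ 0.0557, -0.2040,  1.0570],
])

# Two hypothetical camera->XYZ characterisations of the same sensor,
# standing in for two raw converters.  Placeholder numbers only.
CAM_TO_XYZ_A = np.array([
    [0.60, 0.28, 0.07],
    [0.25, 0.67, 0.08],
    [0.02, 0.12, 0.95],
])
CAM_TO_XYZ_B = np.array([
    [0.63, 0.23, 0.09],
    [0.28, 0.62, 0.10],
    [0.01, 0.15, 0.93],
])

def to_linear_srgb(cam_rgb, cam_to_xyz):
    """Linear camera RGB -> linear sRGB (no white balance or tone curve)."""
    xyz = cam_to_xyz @ cam_rgb
    return np.clip(XYZ_TO_SRGB @ xyz, 0.0, 1.0)

cam_rgb = np.array([0.40, 0.55, 0.30])  # one raw pixel, arbitrary values
print("Converter A:", to_linear_srgb(cam_rgb, CAM_TO_XYZ_A))
print("Converter B:", to_linear_srgb(cam_rgb, CAM_TO_XYZ_B))
```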

Personally, I think some of the fundamentals of human colour perception still haven't been properly researched - CIE horseshoes notwithstanding.

 

For example: What is the objective limit of (average) human perception of colour saturation? And in what units?

 

It seems to me that 'saturation' can be defined in two ways.

1. The narrowness of bandwidth of a spectral colour, and how wide that bandwidth can be made before it's perceived to have changed.

Or

2. How much a monochromatic colour can be 'diluted' with white light before it's perceived as differing from the pure colour.

 

I can find no absolute definition of the boundary of the CIE 'horseshoe' in either of those terms. It appears to be an absolute limit, but with no measurable units attached to it. And given that creating bright, monochromatic light sources was nearly impossible in 1931, that's hardly surprising!
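
The nearest thing colorimetry offers to definition 2 is 'excitation purity': how far along the straight line from the white point towards the spectral locus a colour sits. Here's a minimal sketch; the locus intersection would normally come from the CIE 1931 tables, so the sample and locus coordinates below are just placeholders.

```python
# Definition 2 in numbers: CIE "excitation purity" -- the fraction of the way a
# colour sits along the line from the white point towards the spectral locus.
import math

def excitation_purity(sample_xy, white_xy, locus_xy):
    """purity = |white->sample| / |white->locus|, measured in the CIE xy diagram."""
    return math.dist(white_xy, sample_xy) / math.dist(white_xy, locus_xy)

white_d65   = (0.3127, 0.3290)  # D65 white point
sample      = (0.4500, 0.4100)  # placeholder: a yellowish colour
locus_point = (0.5300, 0.4600)  # placeholder: dominant-wavelength point on the locus
# (In practice the locus point is found by intersecting the white->sample line
# with the spectral locus from the CIE 1931 tables.)

print(f"Excitation purity: {excitation_purity(sample, white_d65, locus_point):.2f}")
```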


Also, we're all different. Since having cataract surgery, my vision is probably better than when I was a teenager. Sure, I can't focus up close without glasses, but my ability to see slight color differences in various so-called whites and pale yellows is excellent, and I see further into the blue than before. The downside is that a standard HD monitor is annoying. I just got a new 2K monitor for photo editing and it's a big improvement. 4K is still more $$ than I'd want to spend.

Many blue LEDs, often used in Christmas lights and advertising signs, are much bluer than the usual blue light bulbs.

 

I believe that some of the filter properties can be corrected in the digital image, though at best you still get a triangle out of the CIE horseshoe.

 

I have wondered about using more filters to capture more of the CIE color space, but that's not likely to happen.

-- glen


Just a thought.

Many of us will doubtless have old books with, by today's standards, appallingly poor colour illustrations. We'll also likely remember early colour television and maybe even the first two-colour Technicolor films (movies).

 

In my view, colour technology has come a long way in the last few decades and on the whole is much more realistic, and cleaner and brighter, than it used to be.

 

I'm not saying it's perfect everywhere, nor that progress should stop; but I do think we might consider whether 'wider' colour spaces are really necessary for our purposes, and whether we'd actually notice any small incremental improvement.

