
If sRGB is so limited why not use DirectX?



Angie, in broad terms, yes: the monitor is in most scenarios the limiting factor, but a program can also impose a limit. Which is part of the answer to your second point: "...it'd be better to have the bits there and provide an opportunity for programs to generate more than 8 bits per channel". Yes, if all else were equal, it would be better.

But not all else is equal: at present, monitors with 10-bit-per-channel colour depth are rare and expensive, and this will not change quickly because the (visual!) advantages aren't such that the wider audience will be easily swayed into buying them. The video cards supporting 10 bits per channel are also still limited, as are the operating systems: Windows 7 was the first, and I think only the latest version of OS X supports it (but my Mac knowledge is poor - I might be off by a version or two).

The last bottleneck is programs: for most programs there is no advantage whatsoever to providing the more accurate output, while the extra "output bits" can mean a performance penalty. And as Andrew said: the advantage of 10-bit is only there if every single step in the chain supports it (application, OS, video driver, video card and monitor) - otherwise it reverts to the usual 8-bit.

So, the only programs I know of so far that can really use the 10-bit path are high-end graphics applications (because those can benefit a bit). For all other applications the market is too small and the advantages too slight. All in all, not a scenario that will see rapid changes, in my view.


Here's my shot: nothing we ever see is a single wavelength on the spectrum; if it were, there would be no such thing as white, brown, or grey. What actually reaches our eyes is a mix of different wavelengths of light, for everything we look at. The three types of cones in our eyes each respond to those wavelengths with a different sensitivity pattern; this is called the tristimulus response. Every color we perceive is a product of these three stimuli, which are roughly red, green, and blue. However, since each cone type responds to a range of wavelengths, the ranges overlap, and there is other biological weirdness, no definition of red, green, and blue can encode every color we can perceive without also encoding a significant chunk of colors we can't perceive, and even some "colors" that can't physically exist because they would need negative light at some wavelengths.
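As a rough illustration of that last point (my own sketch, using approximate CIE 1931 standard-observer values for a pure 500 nm spectral green and the standard XYZ-to-linear-sRGB matrix), representing a single spectral wavelength in sRGB already demands a negative amount of the red primary:

```python
# Approximate CIE 1931 tristimulus values for a pure 500 nm spectral color
# (taken from the standard observer table; treat them as illustrative).
X, Y, Z = 0.0049, 0.3230, 0.2720

# Standard XYZ -> linear sRGB matrix (D65 white point).
r = 3.2406 * X - 1.5372 * Y - 0.4986 * Z
g = -0.9689 * X + 1.8758 * Y + 0.0415 * Z
b = 0.0557 * X - 0.2040 * Y + 1.0570 * Z

print(r, g, b)  # r comes out negative: sRGB would need "negative red light"
```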

sRGB is one definition of red, green, and blue that is intentionally a little narrow so that it matches a "normal" computer monitor's range of colors. Your monitor has its own definition, determined by the colors of its primaries. With a numerically measured "color profile", colors from a working color space like sRGB or the wider AdobeRGB can be converted into a universal (to human vision) but physically impossible system called a connection space, and then out to the color numbers of an output device like a monitor or printer. The better those profiles are, the closer the output of the two devices will be to each other, provided both devices can render all the colors involved.
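To give a feel for the first leg of that trip (a minimal sketch of the sRGB-to-connection-space conversion only; the second leg through a destination profile is device specific, so I've left it out), the standard sRGB formulas look like this:

```python
def srgb_to_xyz(r8, g8, b8):
    """Convert an 8-bit sRGB color to CIE XYZ, the usual profile connection space."""
    def to_linear(c8):
        c = c8 / 255.0
        # Undo the sRGB transfer curve to get linear light.
        return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

    r, g, b = to_linear(r8), to_linear(g8), to_linear(b8)
    # Standard linear-sRGB -> XYZ matrix (D65 white point).
    x = 0.4124 * r + 0.3576 * g + 0.1805 * b
    y = 0.2126 * r + 0.7152 * g + 0.0722 * b
    z = 0.0193 * r + 0.1192 * g + 0.9505 * b
    return x, y, z

print(srgb_to_xyz(255, 128, 0))  # an orange, expressed in the connection space
```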

DirectX and OpenGL both operate in complete ignorance of any of the above, usually resetting the graphics card's color-calibration look-up table (LUT) to defaults on start, and therefore always operate at the full gamut of the screen with no consideration for getting the colors right at all. New versions of DirectX and Windows may allow output to 10-bit screens, but the result is just measuring out the same monitor primaries between the same 0 (full off) and 1 (full on) with a couple more bits of precision after the decimal point. That does give 64 times as many color combinations - four times as many in-between shades per primary - over the same range of colors (hence the claim of "more" colors). However, our eye-brain system can't even perceive the difference between nearby shades among the existing 256 shades per primary, so it will not create new colors.
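The arithmetic behind that "64 times" figure, as a quick check:

```python
per_primary_8  = 2 ** 8    # 256 shades per primary at 8 bits
per_primary_10 = 2 ** 10   # 1024 shades per primary at 10 bits

print(per_primary_10 // per_primary_8)             # 4x the shades per primary
print(per_primary_10 ** 3 // per_primary_8 ** 3)   # 64x the total combinations
```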

Why are there 10-bit screens, and possibly graphics cards, then? We sense things like light and sound on a logarithmic rather than a linear scale: 6, 12, 25, 50, 100% looks more like 20, 40, 60, 80, 100% to our mind. In the graphics card and in a modern flat panel, the color numbers are mapped through a Look Up Table onto a function that compensates for this, and when that function is computed at only 256 shades of precision, rounding errors make it double up some color numbers on the output and leave gaps at others. 10-bit cards and screens reduce this rounding error to make sure we get *at least* 256 distinct shades in each primary.
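Here's a minimal sketch of that rounding problem (my own example, assuming a simple 1/2.2 power curve as a stand-in for a real calibration LUT): quantising the corrected curve to 8 bits collapses some input codes onto the same output and skips others, while a 10-bit output keeps all 256 inputs distinct.

```python
def build_lut(levels_in, levels_out, gamma=2.2):
    """Map each input code through a 1/gamma correction curve and quantise it."""
    lut = []
    for i in range(levels_in):
        x = i / (levels_in - 1)        # normalised input, 0..1
        y = x ** (1.0 / gamma)         # stand-in for a calibration curve
        lut.append(round(y * (levels_out - 1)))
    return lut

lut_8bit  = build_lut(256, 256)    # 8-bit output: duplicates and gaps appear
lut_10bit = build_lut(256, 1024)   # 10-bit output: every input stays distinct

print(len(set(lut_8bit)), "distinct outputs from 256 inputs at 8 bits")
print(len(set(lut_10bit)), "distinct outputs from 256 inputs at 10 bits")
```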


These are all good links to explain color and how it works in practice with monitors. Thank you for your thoughtful responses to a somewhat confused line of questioning.

On a practical level, I second Andrew's suggestion to get a high-quality wide-gamut monitor. I'm quite happy with my calibrated NEC screen. It's much better than the CRTs I switched from, and now I can see colors beyond sRGB that used to be more or less imaginary to me (I knew I was editing in ProPhoto to future-proof my work, but the colors never changed between that and sRGB before).

