
What's the case for gamma today?



<blockquote>

<p>An ICC profile is not the only method for color correcting the output from the camera, but something at least similar will be required—at minimum a color correction matrix.</p>

</blockquote>

<p>@Joe and Andrew: You are correct, that was poorly worded. I should have prefaced the ICC comment with 'For the purposes of this discussion...'. When I ask why gamma correction is needed in a self-contained system that starts in x bits and does not downsize bit depth, I do not just mean starting from a 12-bit raw file. The same reasoning also applies if you start with a 16-bit linear TIFF (perhaps generated by your RC of choice). Would it not be better for your data if Photoshop (or other PP software) were to receive a linear TIFF and work in linear space?</p>


<blockquote>

<p>And I've made a 1.0 gamma profile in Photoshop's CustomRGB in Color Settings for these supposedly linearized images from these RC's. It's not very good. More work than it's worth in getting it to look right.</p>

</blockquote>

<p>@Tim: I see what you mean, but I am not talking about the relative merits of various RCs or the process of rendering a raw file. If it makes it easier to understand, start from a 16-bit linear TIFF whose data has never had gamma applied to it. That does not mean no corrections have been made to make it 'look' better, but it is in linear space - which, as I understand it so far, means it has the least noisy, most accurate, densest and highest-resolution data it can have, especially after all the tweaking that was necessary to get it to look that way. Why apply gamma to it? It will only degrade it.</p>

<p>So is gamma still needed in 2010, for a workflow like mine? Perhaps the answer is no?</p>


<blockquote>

<p>Who and what determines proper linear handling of a demosaiced Raw image in determining its linearized appearance? </p>

</blockquote>

<p>Whoever wrote the raw converter; however they do it, the data is linear (there’s no way around that).</p>

<blockquote>

<p>There's no ground zero for representing unmanipulated linear sensor data. It's all interpreted.</p>

</blockquote>

<p>True in terms of a color space, but at capture the data, the way the photons are counted, is linear.</p>

<blockquote>

<p>The RC Raw Developer's linear setting makes all properly exposed Raw images appear dark without a gamma correction profile assigned on top of other settings including an additional tonal curve to give a normalized appearance.</p>

</blockquote>

<p>Because they want to provide the tone curve as part of the output-referred rendering. They don’t have to. If you zero out all the ACR settings (note the default setting for brightness at 50), you get closer to that scene-referred “dark” look. <br>

There is a lot of interpretation going on during demosaicing; as I said, at this point the RC has to assume which color space the combined filters represent.</p>


Author "Color Management for Photographers" & "Photoshop CC Color Management" (pluralsight.com)


<blockquote>

<p>And I've made a 1.0 gamma profile in Photoshop's CustomRGB in Color Settings for these supposedly linearized images from these RC's. It's not very good. </p>

</blockquote>

<p>RC being Raw Converter? Why ProPhoto primaries? Do you know that’s the correct set of primaries it’s using?</p>

Author "Color Management for Photographers" & "Photoshop CC Color Management" (pluralsight.com)


<p>@Andrew and Tim: I hear you. This thread, however, is about why a self-contained photographer's working color space should be distorted by a power function with a gamma exponent rather than left linear.</p>

 

<blockquote>

<p>If you wish to simulate the physical world, linear-light coding is necessary.<br />On the other hand, if your computation involves human perception, a nonlinear representation may be required</p>

</blockquote>

<p>Hello Jacopo. This is indeed the crux of the matter, and isn't the representation of a capture a simulation of the physical world? In the context of 12/14/16-bit linear data being post-processed in 16 bits, when and why would a non-linear representation be required in a photographer's working color space, other than, potentially, as a very last-step conversion for the sole benefit of the output device?</p>


<blockquote>

<p>This is indeed the crux of the matter, and isn't the representation of a capture a simulation of the physical world?</p>

</blockquote>

<p>Not a simulation of the physical world, but a simulation of the perceived world.<br>

For example, contrast and brightness are perceptual.<br>

The discrete cosine transform for JPEG is performed on gamma-encoded data.</p>

 


<blockquote>

<p>Not a simulation of the physical world, but a simulation of the perceived world.</p>

</blockquote>

<p>@Jacopo: Why would you need to present to our eyes such a simulation? Our objective is to present to our eyes the nearest facsimile we can of the relative luminance that was at the scene (which of course we will perceive virtually the same way, logarithmically). Same relative luminance (perceived brightness) and contrast = same perception. That means an overall system gamma of 1. So why raise our data to any exponent other than one, let alone 2.2?</p>
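<p>To spell out what I mean by a system gamma of 1, here is a trivial sketch in Python (the numbers are mine and purely illustrative): encode scene luminances with a 1/2.2 exponent, let a gamma 2.2 display undo it, and the net transfer is the identity.</p>

<pre>
# A trivial illustration (my own numbers) of an overall system gamma of 1:
# the encoding exponent and the display exponent cancel out.
scene = [0.01, 0.18, 0.50, 1.00]                  # relative scene luminances
encoded = [v ** (1 / 2.2) for v in scene]         # values stored in a gamma 2.2 space
displayed = [e ** 2.2 for e in encoded]           # what a gamma 2.2 display emits
print(displayed)                                  # matches the scene (up to float rounding)
</pre>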

<p>JPEG is different: it is a lossy compressor, which means you are willing to throw data away in order to have smaller files - might as well encode it with something close to an effective gamma of 1/2.2 to make it more perceptually efficient. But that's not our case. We start with 12/14/16-bit linear data and stay in 16 bits. There is no perceptual advantage in encoding OUR data with gamma. Is there?</p>
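<p>To put some rough numbers behind that (mine, and only a sketch): compare the size of one quantization step, relative to the tone it sits on, for a deep shadow at 1% of full scale under 8-bit linear, 8-bit gamma 2.2 and 16-bit linear encoding.</p>

<pre>
# Rough sketch (illustrative values, not anyone's measured data): the relative
# quantization step at a deep-shadow luminance of 1% of full scale. A large
# relative step means visible banding.
L = 0.01                                   # deep shadow, 1% of full scale
gamma = 2.2

step_16_linear = (1 / 65535) / L
step_8_linear = (1 / 255) / L
# One 8-bit step in the gamma-encoded value E = L**(1/gamma) corresponds to
# dL = gamma * E**(gamma - 1) * dE in linear luminance.
E = L ** (1 / gamma)
step_8_gamma = gamma * E ** (gamma - 1) * (1 / 255) / L

print(f"16-bit linear: {step_16_linear:.2%} of the tone per step")
print(f" 8-bit linear: {step_8_linear:.2%} of the tone per step")
print(f" 8-bit gamma : {step_8_gamma:.2%} of the tone per step")
</pre>

<p>Which is the point: the gamma encoding earns its keep when the container is only 8 bits.</p>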


<p>Jack, why not use a Raw converter that allows converting to a custom 1.0 gamma ICC profile you can make in Photoshop? As long as the image looks correct and as intended, the profile will always show the correct preview (in color-managed apps only) with the data remaining linearly encoded.</p>

<p>Andrew, RC=Raw Converter. I don't use or assume ProPhotoRGB primaries using Raw Developer's linear setting. The data is written in sRGB or maybe monitor RGB. When opening this Raw Developer linear TIFF, assigning a canned profile or a Photoshop CustomRGB primaries profile will not make an X-Rite ColorChecker chart test shot look correct. You have to use a more sophisticated profiling package to measure off this TIFF file to build the ICC profile.</p>

<p>Wonder if there’s a way to use Adobe’s DNG Profile Editor with a Raw Developer TIFF?</p>


<blockquote>

<p>Andrew, RC=Raw Converter. I don't use or assume ProPhotoRGB primaries using Raw Developer's linear setting.</p>

</blockquote>

<p>I’m confused then by this:</p>

<blockquote>

<p>And I've made a 1.0 gamma profile in Photoshop's CustomRGB in Color Settings for these supposedly linearized images from these RC's. It's not very good.</p>

</blockquote>

Author "Color Management for Photographers" & "Photoshop CC Color Management" (pluralsight.com)


<blockquote>

<p>why not use a Raw converter that allows converting to a custom 1.0 gamma ICC profile you can make in Photoshop?</p>

</blockquote>

<p>@Tim: yes, if that works it may be a good idea (as Andrew suggested). I am just wondering why everybody is using 1/2.2 gamma color spaces instead. Am I missing something?</p>


<p>Just a note about this: I started tinkering with Raw Developer's linear setting, which turns everything off but still allows a base tone curve, saturation, hue and RGB curve adjustments, and setting black, neutral and white points, along with a gamma slider.</p>

<p>Guess which tool reveals the most shadow detail with the least amount of noise close to black point?</p>

<p>The gamma slider!</p>


<p>The best example of an all linear workflow would probably be Adobe Lightroom. With no extra actions required by the user, a digital raw file and all processing steps stay at 1.0 gamma up until the images are exported.</p>

<p>The displayed RGB coordinates in Lightroom are in MelissaRGB (ProPhoto primaries but sRGB tone curve), though, which I am not sure I agree with since not much uses this color space either inside or outside of Lightroom.</p>

<p>I cannot confirm that the issue still exists in Photoshop CS4 or CS5, but in the past, if an image were resized in a gamma 2.2 space, the resized image would be <em>incorrect</em>, darker than it should be. There may be more at stake with gamma than just rounding errors.</p>
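<p>A toy example of what goes wrong (just the arithmetic, not Photoshop's actual code): downsample a one-pixel black/white checkerboard to a single pixel, once by averaging the gamma-encoded values and once by averaging in linear light.</p>

<pre>
# Averaging gamma-encoded values versus averaging linear light (a minimal
# sketch; the 2.2 exponent stands in for the working space's tone curve).
gamma = 2.2
black, white = 0.0, 1.0                    # gamma-encoded pixel values

# Naive resize: average the encoded values directly.
avg_encoded = (black + white) / 2
luminance_naive = avg_encoded ** gamma     # what that average actually displays as

# Gamma-aware resize: decode to linear light, average, re-encode.
luminance_linear = (black ** gamma + white ** gamma) / 2
avg_correct = luminance_linear ** (1 / gamma)

print(f"gamma-space average: code {avg_encoded:.3f} -> {luminance_naive:.3f} of full luminance")
print(f"linear-light average: {luminance_linear:.3f} of full luminance -> code {avg_correct:.3f}")
</pre>

<p>The gamma-space average comes out at roughly a fifth of full luminance instead of half, which is the darkening described above.</p>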


<p>No, Shadforth, you don't need to know this to be a digital photographer.</p>

<p>If you think this thread is dogged on the subject of gamma, you should’ve been around for the discussions with Timo Autiokari...</p>

<p>http://www.poynton.com/notes/Timo/Concerning_Timo.html<br>

...may your eyeballs bleed and mind melt.</p>

<p>There were some heated arguments going back at least a decade on the subject of a linear workflow, between Adobe alumni engineers, digital evangelists and enthusiasts, and this guy.</p>

<p>For some reason Timo's website's domain name is up for sale. Go figure. Looks like he lost the argument.</p>

<p>Consider these kinds of talks similar to car-repair enthusiasts insisting on the use of synthetic oil and titanium splitfire ignitor spark plugs over the regular kind.</p>


<blockquote>

<p>Guess which tool reveals the most shadow detail with the least amount of noise close to black point?</p>

</blockquote>

<p>Of course: what other tool does Photoshop have that gives you infinite amplification at the origin (and thus extremely high amplification in the dark regions of your picture)? It will amplify both the signal and the noise equally at equal starting values, but is there a lot of signal in the darkest parts of our pictures? No. There is a lot of noise, however (thermal, shot, read, amplifier, reset, etc.). These being the values closest to the origin, guess what gets amplified the most by gamma? That's why sRGB, Melissa and L*a*b* all have linearized curves near the origin. aRGB and Photoshop do not, however. Which brings me back to my question: if there are no advantages in our situation to a gamma-corrected color space, then why correct it in the first place, creating unnecessary discontinuities and quantization and increasing noise in the shadows?</p>
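<p>For anyone who wants to see the numbers, here is a quick comparison (using the published sRGB constants; the rest is my own sketch) of the effective gain applied to near-black values by sRGB's piecewise curve versus a pure 2.2 power curve of the Adobe RGB type.</p>

<pre>
# Effective amplification of near-black values: sRGB's linear toe caps the
# gain at 12.92, while a pure power curve's gain grows without bound as the
# value approaches zero - which is what pulls shadow noise up.
def srgb_encode(L):
    """Linear light -> sRGB, standard piecewise definition."""
    return 12.92 * L if L <= 0.0031308 else 1.055 * L ** (1 / 2.4) - 0.055

def power_encode(L, gamma=2.2):
    """Pure power-law encode with no linear segment."""
    return L ** (1 / gamma)

for L in (1e-6, 1e-4, 1e-2):
    print(f"L = {L:g}: sRGB gain ~{srgb_encode(L) / L:.0f}x, "
          f"pure 2.2 gain ~{power_encode(L) / L:.0f}x")
</pre>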

<blockquote>

<p>I don't know what else to tell you except that it's a PITA to do it this way</p>

</blockquote>

<p>Perhaps I am missing something. Why would it be such a PITA? All you need is to choose a working color space with a gamma of 1. Everything else stays the same. I just wonder why everybody isn't doing this in 2010. Perhaps there are good reasons. What might they be?</p>

 


<blockquote>

<p>@Jacopo: Why would you need to present to our eyes such a simulation? Our objective is to present to our eyes the nearest facsimile we can of the relative luminance that was at the scene (which of course we will perceive virtually the same way, logarithmically). Same relative luminance (perceived brightness) and contrast = same perception. That means an overall system gamma of 1. So why raise our data to any exponent other than one, let alone 2.2?</p>

</blockquote>

<p>Jack, think about camera exposure. Why don't makers use a linear scale?<br />You could answer that they use a logarithmic scale.<br />But why?<br />The answer is that they use a “perceived” linear scale, not a “physical” linear scale.<br /><br />If you can approximate a “perceived” space, then in that space things behave linearly.<br /><br />JoeC wrote:</p>

<blockquote>

<p>The displayed RGB coordinates in Lightroom are in MelissaRGB (ProPhoto primaries but sRGB tone curve)</p>

</blockquote>

<p>If I'm right, MelissaRGB is used only for building the histogram. An unfortunate choice, I think.<br />The displayed RGB coordinates are in the monitor's gamut.</p>


<p>The year being 2010, with its cheap processing power and 16-bit (or even 32-bit) colour depth, makes absolutely no difference to the basics. Those basics being that <em>we actually want to be able to view our pictures</em>, and that there is no practical viewing device that shows the same contrast ratio or brightness range as we see in real life.</p>

<p>Forget the claims of LCD monitor makers of greater than 1000:1 contrast ratios. That's just nonsense and the manufacturers know it! If you actually measure the brightness range with a photometer, you'll find the average monitor manages about 300:1, at best and in a darkened room. Paper prints fare even worse, probably scraping just over 100:1 in normal lighting conditions and far worse if framed behind glass. Therefore a modest 7-stop subject brightness range (128:1) needs some gamma adjustment to fit onto a paper print, and anything over 8 stops needs help to be shown on a computer monitor <em>as viewed in a darkened room</em>. If we view the LCD display in normal room lighting then its contrast ratio drops to little better than a paper print's. That's why, by default, most modern LCD displays emulate the old CRT gamma of 2.2 or thereabouts.</p>
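<p>To put rough numbers on that (the device figures below are estimates in the spirit of the ones above, not measurements), here is the stop-to-contrast-ratio arithmetic:</p>

<pre>
# Back-of-the-envelope arithmetic: contrast ratio = 2**stops. The device
# figures below are rough estimates, not data.
import math

devices = {
    "LCD in a dark room": 300,
    "LCD in room light": 150,
    "paper print": 100,
    "print behind glass": 50,
}

scene_stops = 8                          # a modest subject brightness range
scene_ratio = 2 ** scene_stops           # 256:1

for name, ratio in devices.items():
    fits = "fits" if ratio >= scene_ratio else "needs compression"
    print(f"{name}: {ratio}:1, about {math.log2(ratio):.1f} stops -> "
          f"an {scene_stops}-stop scene {fits}")
</pre>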

<p>In short, we still need gamma! As proof of this, look at the popularity of HDR techniques, which represent gamma gone mad.</p>


<blockquote>

<p>There were some heated arguments going back at least a decade on this subject of a linear workflow</p>

</blockquote>

<p>@Tim: that's a good site. I am a fan of Poynton's; he is probably one of the most authoritative people around on the subject. From the link you provided, this is one of the first quotes that I ran into:</p>

<blockquote>

<p>Linear intensity coding is fine if you can afford 12 or 14 or 16 bits per component, but if you have only a limited number of bits per component - 8, say - you must code nonlinearly to get decent performance.</p>

</blockquote>

<p>We have been able to afford 16 (OK, 15 in Photoshop) bits for a few years. Gamma is needed to counteract the physical characteristics of your output device, or if you need to COMPRESS your data. We photographers do not want to compress our data (that's lossy compression, by the way). We want to maintain its integrity as much as we can, so we always have the best, densest, least noisy, etc. data as a base to play with. Would a sound engineer mix a new track for a Super Audio CD using an MP3-compressed version of the piece as a source, instead of the master 24-bit linear digital track? Of course not. So why do we almost do that in our PP software?</p>
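<p>To make the Poynton quote concrete (the 5% shadow cutoff is my own, purely illustrative choice), here is a quick count of how many code values are left to describe the shadows under each encoding:</p>

<pre>
# How many code values describe the shadows (luminance below 5% of full
# scale, an arbitrary illustrative cutoff) under linear and gamma 2.2
# encoding, at 8 and 16 bits.
import numpy as np

def shadow_codes(bits, gamma=None, threshold=0.05):
    codes = np.arange(2 ** bits) / (2 ** bits - 1)
    luminance = codes if gamma is None else codes ** gamma   # luminance each code represents
    return int(np.sum(luminance < threshold))

print("8-bit linear :", shadow_codes(8))          # about 13 code values
print("8-bit gamma  :", shadow_codes(8, 2.2))     # about 66 code values
print("16-bit linear:", shadow_codes(16))         # over 3000 code values
</pre>

<p>Which is why Poynton's caveat about 8 bits does not seem to bite once we stay at 16.</p>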

<p>@Jacopo: perception does not come into the equation until AFTER the output. I am asking why use a gamma-corrected color space BEFORE the output (if ever - it depends on the type of output). Given the state of the art in 2010, IMHO it is easier and less noisy to work on linear data up to that point.</p>

<p>Hi Rodeo: I agree with most of what you say. But, similarly to Jacopo, you are talking about the output device (where we are often stuck with non-linear properties that need to be compensated for).</p>

<p>I am talking about our internal working color space: for instance, why apply a power function to your linear data when you leave your raw converter to go into your favorite PP program, and then why use a gamma-corrected color space within it? Where are the benefits? This is not a rhetorical question. I am asking because perhaps there is a fault in my reasoning, and I am more than happy to change my mind if someone can come up with a good SPECIFIC reason why. Anyone?</p>


<blockquote>

<p>@Jacopo: perception does not come into the equation until AFTER the output.</p>

</blockquote>

<p>I don't agree.<br>

The way software modifies data depends on the image values (gamma-encoded values are different from linear values).<br>

So the software has to select the better approach. And following that selection, it has to give you a correctly scaled slider.<br>

For example, there are situations where software can work in the Lab color space.<br>

You have no control over this. And this is beneficial, I think, as the software tries to make the best choice.</p>

 


<blockquote>

<p>If I'm right, MelissaRGB is used only for building the histogram.</p>

</blockquote>

<p>Correct, that and the RGB percentages. Not really useful. </p>

 

 

<blockquote>

<p>save us! Who gives a rats? I thought "gamma" was what killed you when an A bomb went off.<br />Do I have to learn about this, now, to understand digital photography?</p>

</blockquote>

 

<p>Some of us do want to know what’s going on under the hood (just as some of us wanted to understand how analog photography worked, or mixed our own chemistry). No one is forcing you to read or attempt to comprehend the subject of gamma or TRCs. If you don’t want to, don’t! </p>

<p>As for Timo, just about everyone besides Timo, at least those in the image-processing world (a slew of Adobe engineers), dismissed most of his ideas. So it’s no wonder his site has slipped into the ether.</p>

Author "Color Management for Photographers" & "Photoshop CC Color Management" (pluralsight.com)


<blockquote>

<p>And following that selection, it has to give you a correctly scaled slider.<br />For example, there are situations where software can work in the Lab color space.</p>

</blockquote>

<p>Jacopo, with regard to sliders, histograms, controls, etc., I agree: ask the programmers to scale them whichever way is most perceptually intuitive. And when post-processing your linear data, feel free to apply whatever corrections, or to convert into whatever color space you feel is most appropriate (keeping in mind the PCS round-trip penalty discussed in an earlier post), to obtain the result you want. But why start off with distorted data? Why should Lightroom (which by some accounts keeps data linear until the end) need to apply gamma before passing data to Photoshop or another post-processing program? Why should any RC or PP program, unless it is required by the output? Why not pass a nice 16-bit TIFF or equivalent in a suitably sized linear color space with gamma=1?</p>
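<p>Out of curiosity, here is a rough estimate (my own sketch, not anyone's actual pipeline) of what a linear to gamma 2.2 and back round trip costs a file at 16 bits compared with 8 bits:</p>

<pre>
# Round-trip a full linear ramp through a gamma 2.2 working space and back,
# quantizing at each step, and report the worst-case error in counts and how
# many distinct levels survive. Purely illustrative; no profile conversions
# or PCS round trips are modelled here.
import numpy as np

def roundtrip_loss(bits, gamma=2.2):
    maxv = 2 ** bits - 1
    linear = np.arange(maxv + 1)
    encoded = np.round((linear / maxv) ** (1 / gamma) * maxv)   # into the gamma space
    decoded = np.round((encoded / maxv) ** gamma * maxv)        # back to linear
    worst = int(np.abs(decoded - linear).max())
    levels = int(np.unique(decoded).size)
    return worst, levels

for bits in (8, 16):
    worst, levels = roundtrip_loss(bits)
    print(f"{bits:2d}-bit: worst error {worst} count(s), {levels} of {2 ** bits} levels survive")
</pre>

<p>The levels that merge are adjacent counts in the upper part of the range; whether that loss, and the shadow issues discussed earlier, should worry anyone at 16 bits is exactly the question I am asking.</p>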


<blockquote>

<p>Why should any RC or PP program, unless it is required by the output? Why not pass a nice 16-bit TIFF or equivalent in a suitably sized linear color space with gamma=1?</p>

</blockquote>

<p>Do you think all of them are color managed?<br>

If the answer is no, then it is always “required by the output”.</p>



<p>All right, let me ask the question a different way (again, not a rhetorical question). In this fairly comprehensive <a href="http://www.brucelindbloom.com/WorkingSpaceInfo.html">list of color space specifications</a>, why isn't there one titled 'Modern Digital Photographer's Post-Processing Working Color Space' with a gamma equal to uno? Why isn't there a single one with a gamma of one :-)? Legacy? Or what?</p>