Why does color accuracy matter?


Please forgive me for this obvious question.

I am considering purchasing a monitor, which these days usually means an LCD monitor. Now I see that most LCD monitors are not very color accurate, so I am pondering getting one that is more accurate. Currently most of my photography is not done digitally anyway, but I do get negatives scanned.

The very obvious question is: where in the photography process does color accuracy matter? Is it because:

(1) The photographer has some special eye and can't stand to see the colors wrong when he looks at them on the screen, sort of like professional musicians who can't stand instruments that are out of tune?

(2) The photographer is actually manipulating the colors on screen, and if they aren't accurate it will make little Johnny's face green or some such thing?

(3) Matching the printed output to the image on the screen is important?

If I just have someone else manipulate the images and I only resize them on my computer, it doesn't matter too much, does it? Can't the software tell if the colors are right, or is this something that frequently has to be adjusted by hand?

I'm thinking of either the ViewSonic VP2365wb or perhaps the Dell UltraSharp U2410. I don't even need anything this big (23" or 24"), but it seems the better technology goes into the larger monitors. Any comments on these?

Thanks,
Michael Hoffman


Accurate color is a marketing buzzword, by and large. My idea of accurate color requires something known as colorimetry: the idea that we use instruments to measure a color, ideally describing its spectrum. We can compare that to other colors, but the rub comes when you decide you want accurate (not matching or pleasing) color on your computer system. Measured color is known as scene-referred. It's generally butt ugly when you view it as-is on a display system or send it directly to some output device. Output-referred is the name for color that has been optimized for a display or output device. So we have a big disconnect between accurate, scene-referred color and output-referred color (for more examples, see: http://www.color.org/ICC_white_paper_20_Digital_photography_color_management_basics.pdf).
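
To make the scene-referred vs. output-referred disconnect concrete, here is a minimal sketch in Python (my illustration, not from the ICC paper), assuming only the standard sRGB encoding function: linear scene measurements sent straight to a display look far too dark, which is exactly why output-referred encodings exist.

```python
# Minimal sketch: scene-referred (linear) data looks wrong when sent
# straight to a display; an output-referred encoding fixes that.
# Illustration only; real ICC pipelines also do gamut mapping, etc.

def srgb_encode(linear: float) -> float:
    """Standard sRGB encoding: linear scene light [0..1] -> display value [0..1]."""
    if linear <= 0.0031308:
        return 12.92 * linear
    return 1.055 * linear ** (1 / 2.4) - 0.055

scene = 0.18                            # an 18% grey card, scene-referred
print(round(scene * 255))               # ~46: far too dark if shown as-is
print(round(srgb_encode(scene) * 255))  # ~118: roughly mid-grey, as seen at the scene
```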

We generally want two things. One is pleasing color: the representation on screen or in print of an image as you recall it looking at the scene. We also want matching color (my display and my print appear to match). To get either, the numbers between the two items are never the same.

What you want in terms of an "accurate" display is one you can calibrate and profile to a high degree of consistency, so that the RGB numbers you see today are the same in a year. You want a display that lets you calibrate it with the least mucking around, ideally in high bit depth (to avoid banding), using software controls that let you specify a contrast ratio, white point, and luminance that provide a visual match to how you are viewing the print next to that display.

For this task, I'd recommend something like the NEC SpectraView II line with their software for calibration and profiling, and a supported instrument. If you search these forums, you'll find a lot of info about this line of products.

Author "Color Management for Photographers" & "Photoshop CC Color Management" (pluralsight.com)


I'd say #2 and #3 are correct.

If you have no intention of doing edits beyond resizing and don't mind if images don't look the way the image creator intended, don't calibrate your monitor. If you care, buy a basic device like the Eye One Display and it will be good enough even with a basic screen.


Michael:

It's important to me to be able to look at my monitor and know what the resulting print will look like. I want neutral colors to be neutral, without any color casts. If I want something slightly warm, I want it slightly warm: neither neutral nor very warm.

In the end, the cheaper monitors deliver more bang for the buck. It's easier to produce something to 90% perfection than 95%. That extra 5% might double the price. Getting to 98% may increase the price tenfold.

Eric


I disagree that cheap monitors are at all useful! The display is the only window into your data, and you should expect that if you view a group of RGB values today, they will look exactly the same next week and in a year. Displays are unstable devices; they need regular calibration. The quality of the display profile plays a huge role in color matching a print when soft proofing in ICC-aware applications. For the price of a bit more RAM, which may speed up operations by maybe a few minutes a day, one can get a really good, high end, reference display with an instrument, like the NEC SpectraView (and even their entry level P221W, an excellent value, costs little more than a slightly larger and inferior display). Don't skimp on the display!

Author "Color Management for Photographers" & "Photoshop CC Color Management" (pluralsight.com)


The main purpose for caring about color accuracy is printing for clients, whether you print it yourself or have it printed professionally (not at Walmart). This ensures that what you see on the screen is what the client will see on the print.

A secondary reason is if you personally care about the accuracy of your image for any reason at all, maybe because you make prints to sell or just want accurate colors for the web.

But if neither of these two things is important to you, then save your money and get a cheap TN LCD.


At a certain point it's just a matter of degree. You can get all worked up about having "perfect" color accuracy, and for some that is necessary and important. For many others it's just a matter of how accurate. For most, I agree with Roger Smith above, FWIW. If you are doing critical color correction, then you will want to throw the resources into achieving that. If you just want to look at photography with a fairly accurate idea of how the creator intended it, a standard calibration will generally serve. I do know that monitors matter, though. At work we have these 22" HP LCDs and they are butt ugly in terms of screen clarity and color fidelity. But a decent middle-of-the-road monitor will serve many.

I'm on both sides of the fence when considering this overall concept of color accuracy.

Personally, I prefer the words "color consistency."

I am a tad shade blind; in other words, navy blue and black appear almost the same to me. So what to do? Well, I know that to preserve some detail in blacks, I set my black levels at 5-5-5, or set auto-clip to this level. The same goes for white: 244-244-244.
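
For what it's worth, a minimal sketch of that black/white level clamp in Python with NumPy (the 5 and 244 are the values from the post; image editors expose this as an output-levels control):

```python
import numpy as np

# Clamp output levels so blacks never drop below 5 and whites never
# exceed 244, preserving a little detail at both ends of the range.
img = np.array([[0, 3, 128, 250, 255]], dtype=np.uint8)
print(np.clip(img, 5, 244))  # [[  5   5 128 244 244]]
```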

What is red to me may not be the same red to you. It is for this reason that I find the words "color accuracy" kind of silly, save for scientific measurements.

I've viewed images on high quality CRT monitors and high quality LCD monitors. Other than contrast changes when viewing off angle, I can't see a big difference. I'm sure it's there, but I don't see it, so an expensive CRT monitor is a waste of money for me.

After a lot of experimentation and experience with my workflow, I produce consistent, pleasing, and reproducible results. I white balance manually when it counts, which reduces my odds of a color cast. The agencies I license to have never complained about out-of-whack color balance.


Quote: "You can get all worked up about having 'perfect' color accuracy and for some that is necessary and important."

There is accurate color, which, as I've tried to explain, is quite different from consistent color. Again, displays are not stable devices. Left alone, the RGB values you see today and the values you see in a month or six months will change. It doesn't matter whether you print them or not; the idea is to view what a computer understands, a pile of numbers, and make some decisions about them. ICC-aware applications need a display profile to do this, and the display profile has to be updated on a regular basis after calibration (getting the device back to its desired condition). R239/G89/B12 is not the same color in sRGB as it is in ProPhoto RGB or Adobe RGB (1998). The only way an ICC-aware application can even begin to show you those colors correctly (forget accurate) and consistently is by calibrating and profiling your display.

Higher end, though not necessarily vastly more expensive, display solutions do this with a one-button system, which saves you time and ensures you are calibrating to the same target and thus seeing the same numbers correctly and consistently. That means buying an instrument and software. If you look at the cost of a good solution (EyeOne Match with an EyeOne Display-2), factor that into the final solution, then look at the bundle of an NEC with their instrument and software (considering the ease of use, added control, and a high bit, wide gamut panel), you'll find that you are not shelling out exorbitant money for what is the only device that allows you to view that big pile of numbers representing your image!
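
To see that the same numbers are different colors in different spaces, here is a small sketch (my illustration) using the published D65 RGB-to-XYZ matrices for sRGB and Adobe RGB (1998): one triplet, two different colors.

```python
import numpy as np

# The same RGB triplet interpreted in two working spaces lands on two
# different XYZ (device-independent) colors.
M_SRGB = np.array([[0.4124, 0.3576, 0.1805],
                   [0.2126, 0.7152, 0.0722],
                   [0.0193, 0.1192, 0.9505]])
M_ADOBE = np.array([[0.5767, 0.1856, 0.1882],
                    [0.2973, 0.6274, 0.0753],
                    [0.0270, 0.0707, 0.9911]])

def srgb_to_linear(c):
    # Standard sRGB decoding function
    return np.where(c <= 0.04045, c / 12.92, ((c + 0.055) / 1.055) ** 2.4)

rgb = np.array([239, 89, 12]) / 255.0        # one pile of numbers...
print(M_SRGB @ srgb_to_linear(rgb))          # ...interpreted as sRGB (XYZ)
print(M_ADOBE @ rgb ** (563 / 256))          # ...interpreted as Adobe RGB (XYZ)
```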

Author "Color Management for Photographers" & "Photoshop CC Color Management" (pluralsight.com)


Thank you all for such interesting responses.

What are some examples of the color changes you typically make where the color accuracy of the monitor matters?

Is it a matter of trying to get the colors to be accurate, or something more than that?

Is it knowing whether or not the photo exposure and the resulting colors are good, period?

How does this line up with what people used to do in the darkroom?

I've only done color processing in the darkroom once or twice, and it took an hour to come out with one photo that still didn't exactly have the right colors, so it's kind of hard to imagine adjusting it manually in Photoshop.

It's easier for me to understand B&W - either it's too dark, too light, etc.

Thanks


A good monitor will provide more than just color accuracy (or consistency); it can also provide a wider range of colors. This is known as gamut and, again, is usually only important to actual artists/professional photographers who want to see all the colors possible from their image.

TN LCDs use 6 bits/color, VA LCDs 8 bits, and IPS LCDs either 8 or 10 bits/color, with the higher bits/color able to display more colors (and less color banding); see the sketch below.

Gamut is usually rated as a % of Adobe RGB, with the best LCDs rated at approximately 125%. Again, the higher the %, the more colors displayed.

But the question you need to ask yourself is: do you have a need or desire for this kind of color quality, and how much are you willing to pay for it?
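
A quick sketch of the bits-per-channel point above (my illustration): a 6-bit panel has only 64 levels per channel where an 8-bit panel has 256, and those coarser steps are what show up as banding in smooth gradients.

```python
import numpy as np

# A smooth 8-bit ramp has 256 distinct levels; squeezed through a
# 6-bit panel it keeps only 64, so each step is 4x as large.
ramp8 = np.arange(256, dtype=np.uint8)
ramp6 = (ramp8 >> 2) << 2               # keep only the top 6 bits
print(len(np.unique(ramp8)))            # 256
print(len(np.unique(ramp6)))            # 64
```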


<p>"What are some of the examples of color changes that you do typically where the color accuracy of the monitor matters?"</p>

<p>Sure, overall color balance is one. Say you have a cool foggy scene and your monitor is displaying things bluish you'll overcorrect and make it look too warm for the photo mood. You may screw up "memory colors" like skin tones, foliage and water because your display is inaccurate, leading you to oversaturate or correct them in a way that looks unnatural to others. You may also have a totally inappropriate level of brightness and contrast for a print leading to the endless complaints here of "my prints are too dark." I speak from experience from the bad old days of "calibrating" a screen by eye- I had all of these problems and more.</p>


I don't disagree with anything you say, Andrew. I wasn't saying calibration is not necessary or that you don't have to re-calibrate on a regular basis. For me, it doesn't matter what the scientific difference is between "accurate" and "consistent" color. I just want my three monitors to be consistent with each other and with my prints. If those three sufficiently match up, I will call that accurate, whether it's scientifically valid or not. I do the empirical test of watching friends view my work in an ICC-aware application or browser (Safari, not Firefox), and if the results are sufficiently consistent, I'm happy with color management.

So, I've been mulling this over a bit.

A few more questions:

If all my images are in sRGB and I do all my work in sRGB, I gain nothing from a wide gamut monitor, do I? Except if I somehow like to look at images from someone else who is using a different gamut like Adobe RGB or ProPhoto? Does a better wide gamut monitor display better sRGB in color aware applications? Or is it worse?

Is this whole wide gamut thing new with LCD screens? The same thing doesn't happen with CRTs at all?

When processing raw files, what do people typically use for the gamut? Is it typically sRGB? I know there are huge debates about this, but I presume you have to choose a gamut to work in when you process raw files? And that's the point at which you have to choose, right? I also presume the JPG files coming out of a digital camera are mostly sRGB? Or can you typically choose in the camera setup?

Also, after reflecting on the whole tagged-or-untagged images on the Internet thing: what people say doesn't make sense to me. People say the problem is all the untagged images, but if we presume that the majority of those are sRGB, isn't the problem on the interpreting end instead of the image end? Why can't the color management assume that an untagged image is sRGB, say "hey, I know how to handle sRGB," and display it as such? It seems more sensible to do this than to display the majority of images (sRGB) badly for the rare cases where someone uses something else. What am I missing here?
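
For what it's worth, the "assume untagged means sRGB" policy described above is what most color-managed software does in practice. A minimal sketch using Pillow (the file name is hypothetical):

```python
import io

from PIL import Image, ImageCms

# If the file carries an embedded ICC profile, use it; otherwise fall
# back to assuming sRGB, as most color-managed viewers do.
img = Image.open("photo.jpg")  # hypothetical file
icc_bytes = img.info.get("icc_profile")
if icc_bytes:
    src_profile = ImageCms.ImageCmsProfile(io.BytesIO(icc_bytes))
else:
    src_profile = ImageCms.createProfile("sRGB")  # the assumed default
# An ICC-aware viewer would now transform src_profile -> display profile.
```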

Thanks for your help.



Quote: "If all my images are in sRGB and I do all my work in sRGB, I gain nothing from a wide gamut monitor, do I?"

Nope.

Quote: "Does a better wide gamut monitor display better sRGB in color aware applications? Or is it worse?"

Technically worse.

Quote: "When processing raw files, what do people typically use for the gamut? Is it typically sRGB?"

I use ProPhoto RGB, because raw can capture scenes whose gamut is large and easily falls outside sRGB. If you shoot JPEG and set the camera to sRGB, that's moot. If you are shooting raw, you most certainly have data that falls outside sRGB, unless you only shoot on foggy days, if you get my drift.
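
A small sketch of how data can fall outside sRGB (my illustration, using the standard D65 matrices): a fully saturated Adobe RGB green would need a negative red value in sRGB, i.e. it cannot be represented there at all.

```python
import numpy as np

# Take pure Adobe RGB green to XYZ, then to linear sRGB. The negative
# red component means the color is outside the sRGB gamut.
ADOBE_TO_XYZ = np.array([[0.5767, 0.1856, 0.1882],
                         [0.2973, 0.6274, 0.0753],
                         [0.0270, 0.0707, 0.9911]])
XYZ_TO_SRGB = np.array([[ 3.2406, -1.5372, -0.4986],
                        [-0.9689,  1.8758,  0.0415],
                        [ 0.0557, -0.2040,  1.0570]])

green = np.array([0.0, 1.0, 0.0])            # saturated Adobe RGB green (linear)
print(XYZ_TO_SRGB @ ADOBE_TO_XYZ @ green)    # ~[-0.40  1.00 -0.04]
```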

Author "Color Management for Photographers" & "Photoshop CC Color Management" (pluralsight.com)


I would say accurate color = intended color. I find the best definition of color in the book Real World Color Management:

Quote: "Color is an event that occurs among three participants: a light source, an object, and an observer. The color event is a sensation evoked in the observer by the wavelengths of light produced by the light source and modified by the object. If any of these three things changes, the color event is different - in plain English, we see a different color."

Colorimetry and devices that measure color cannot measure the sensation evoked in the observer, but they can accurately measure the wavelengths of light reflected by the object. So the technology can at least ensure that the same color information is sent to the different observers.

Intended color in digital cameras and scanners is defined by their manufacturers, who create complex algorithms to convert the raw color information from the wider spectrum and dynamic range the cameras can capture to the limited gamut and dynamic range of current display devices intended for human perception. Certain color standards like sRGB are used as a target for this conversion.

Intended color in digital paintings, edited photos, and design is the color the creator sees on the device used as feedback for the creation (usually a monitor).

The only reliable way to ensure that you see colors as intended, and that others can see the colors you create, is proper color management by all parties sharing the color information.


Quote: "If all my images are in sRGB and I do all my work in sRGB, I gain nothing from a wide gamut monitor, do I?"

In theory no, but in practice you actually gain a lot. The reason is that a wide gamut monitor can be used to represent the narrower sRGB gamut much more accurately than native sRGB monitors, which in reality do not precisely cover the sRGB gamut. If their colors are wrong or change over time, you don't have additional colors to pick from for replacement. Not only that, but the wider gamut allows, and usually comes with, other monitor enhancements and goodies that increase overall image quality. I have a Dell U2711, which is a wide gamut monitor, and the sRGB content is simply stunning in comparison to the native sRGB monitors I've seen.

Quote: "Does a better wide gamut monitor display better sRGB in color aware applications? Or is it worse?"

It is better if you have proper color management in place, for the reason I gave above, but worse if you don't. Because, unlike with sRGB monitors, there is no standard target for manufacturing wide gamut monitors, a hardware calibration device with good color management software is a must.

Quote: "When processing raw files, what do people typically use for the gamut? Is it typically sRGB?"

It depends on how the image will be used and how capable the person editing the image is. If it is intended for viewing on sRGB monitors and the person has good skills and taste, the best thing is to use a properly calibrated sRGB feedback and play with the controls until the image looks as good as possible. If the monitor or the skills cannot be trusted, then conversion to sRGB is the closest alternative. This is similar to what digital cameras automatically do when saving photos to the card as JPGs, though each manufacturer tries to come up with a better algorithm than the competition to make the images from their cameras more pleasing.
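
A minimal sketch of that conversion step using Pillow's ImageCms (the file and profile paths are hypothetical; substitute the profile your image is actually in):

```python
from PIL import Image, ImageCms

# Convert a wide-gamut edit to sRGB before publishing for the web.
img = Image.open("edited.tif")               # hypothetical wide-gamut image
srgb = ImageCms.createProfile("sRGB")
converted = ImageCms.profileToProfile(img, "ProPhoto.icm", srgb)  # source profile path is hypothetical
converted.save("for_web.jpg", quality=90)
```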


