True Color

Discussion in 'Digital Darkroom' started by jerry plunk, Jul 20, 2015.

  1. I am attempting to capture a color, I'll call it PMS 490, and how it is affected by its environment during a sunny and during an overcast day. I want to see exactly how 490 looks when affected by those elements. I do not want the camera, the monitor, or the printer to alter the color, but I do want to see how the direct sun and indirect sun (sunny and overcast) affect that solid color.
    My thought is to purchase the ColorChecker Passport by X-Rite so I can dial back in a profile that was recorded at the very moment I shot the PMS 490. That way I can see, without any other influences, how the sun, or lack of it, affected that solid color.
    What are your thoughts on the most scientific way to go about achieving this and arriving at a true color as it is affected by the sun and overcast?
     
  2. You ask a very difficult question, and unfortunately there aren't good answers anyone can hand you.
    Short answer: There is no true color.
    Longer answer:
    First you need to study a few things about color. I suggest you read about metamerism to understand more about how color depends not only on the reflective/absorptive properties of the target and the spectrum of the light hitting it (which you seem to understand), but also on the absorption spectra of the detectors (e.g. the photopigments in the cones in your retina) and the response properties of those detectors.
    Once you understand these things, you will probably realize that the imposition of absorption spectra in a camera sensor, the spectral output of your monitor's channels, and/or the absorption spectra of the dyes or pigments from your printer impose additional variables that would make it impossible to characterize the printed or displayed output as anything "true." You can only create metameric equivalents to your stimuli that will fail under certain circumstances. (Google "metameric failure.")
    Piled on top of all of that is your own visual system's adaptation to the spectra of different light sources. Incandescent light doesn't look orange to us, nor does fluorescent light look sickly green (unless compared side-by-side).
    Still another factor is that of your definition of "sunny" and "overcast." There are degrees of "sunny," with different spectral properties. And there are especially different degrees and properties of "overcast."
    In my opinion, the only way to compare is to forget the camera and the monitor and printer. Use your eyes. You need a side-by-side comparison under the two different real-world lighting conditions. Unfortunately you must simulate "overcast" (however you define it) by filtering/modifying the "sunny" light.
    That said, I will offer another short answer: Sunlight is spectrally complex (or "rich"), and so is filtered sunlight. I suspect your PMS 490 will look the same to your adapted eyes under either lighting condition. Metameric failure would seem unlikely, given those two light sources. If you were comparing sunlight with some variety of fluorescent, for instance, then your PMS 490 could take on considerably different appearances, depending on what it is.
    But again, truly, don't even think of involving a camera, monitor, or printer in all of this!
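    The metamerism argument above can be sketched numerically. This is a toy model, not real sensor or daylight data: the channel sensitivities are invented Gaussians, and the "fluorescent-like" illuminant is a made-up spike. It shows two different reflectance spectra producing identical sensor responses under one illuminant but diverging under another.

```python
import numpy as np

# Toy spectral model: 31 samples from 400 to 700 nm in 10 nm steps.
wl = np.arange(400, 710, 10, dtype=float)

def gauss(center, width):
    return np.exp(-0.5 * ((wl - center) / width) ** 2)

# Hypothetical camera channel sensitivities (NOT real sensor data).
S = np.stack([gauss(610, 40), gauss(540, 40), gauss(460, 40)])  # (3, 31)

E1 = np.ones_like(wl)                    # flat "reference" illuminant
E2 = 1.0 + 2.0 * gauss(585, 15)          # spiky, fluorescent-like illuminant

def response(E, r):
    # Channel value = sum over wavelength of illuminant * reflectance * sensitivity
    return S @ (E * r)

# Build a metameric pair under E1: add a spectral component that is
# invisible to S under E1 (it lies in the null space of S weighted by E1).
r1 = 0.2 + 0.5 * gauss(650, 60)                     # base reflectance
b = gauss(580, 20)                                  # a narrow spectral bump
v = b - np.linalg.pinv(S * E1) @ ((S * E1) @ b)     # project onto null space
r2 = r1 + 0.1 * v

print(np.allclose(response(E1, r1), response(E1, r2)))  # True: metamers under E1
print(np.allclose(response(E2, r1), response(E2, r2)))  # False: the match breaks
```

    The null-space construction is just linear algebra; the point it illustrates is the one made above: a pair can match under one light and fail under a spikier one.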
     
  3. Better would be to describe what you're trying to accomplish, presumably digitally.
    If you wanted to know what PMS 490 looked like under daylight and overcast, you'd take a sample of that color under those conditions and just look at it. But I suspect you want to do something that involves a digital process, which means RGB numbers to define the color, and now we have to get into colorimetry and color management.
     
  4. And for that using a spectrophotometer is better than using a camera.
     
  5. Oops! Editing error:
    the imposition of absorption spectra in a camera sensor, the spectral output of your monitor's channels, and/or the absorption spectra of the dyes or pigments from your printer impose additional variables that would make it IMpossible to characterize the printed or displayed output as anything "true."​
     
  6. You are looking to see how the color metamerises when you change illumination types.

    First you will need to establish a baseline to see if, and what kind of, metamerism there is under other illumination types.

    You should also check to see how choice of different color space profiles affects the rendering of the color as well.
     
  7. I am attempting to capture a color, I'll call it PMS 490, and how it is affected by its environment during a sunny and during an overcast day.​
    Why not just use your eyes? I have a Pantone Matching swatch book, and PMS 490 coated looks to be a dark but somewhat rich and vibrant red-brown, while uncoated looks less rich and vibrant.
    This site and this forum are for photographers, not scientists. Pay grade is significantly different. If you want scientists to help you, then hang out in color science forums.
    I think you're making this harder than it has to be. Just look with your eyes, photograph the PMS 490 swatch, and edit it to make it match what you saw. That color is well within a camera's capabilities to reproduce, even in the sRGB gamut.
     
  8. Wow. There are some bitter folks up in this website. Try the decaf and get out of the house a little more. However, thank you Andrew Rodney and Ellis Vener for being professional and educated enough to understand my question and give a straight answer to a fellow professional. I look forward to learning from you. For you guys, I'll elaborate.
    As I mentioned, I'm attempting to capture a color. A prescribed, solid swatch of color at a distance of about 20 feet printed on a prescribed substrate by a prescribed printer. I want to capture this color swatch on my Canon 5d2 with a 70-200 2.8-L, and have that swatch travel from capture to client viewing on my monitor and have "virtually" very little shift in color value and hue. I want to (A) shoot the first sample on a clear day with full sunshine, and then (B) shoot the same sample, same location, same camera/lens, on a cloudy day.
    When viewed on my monitor (assuming it is calibrated correctly for the environment in which it sits), the two swatches can be seen side by side, and the client can view them and understand how the sun, and the lack of sun, in the environment where the swatch physically sat at the moment of the shoot, affects that particular color, used on that particular substrate and printed by that particular printer.
    Basically, the only changing variable from one image to the other image is the effect of the sun behind the clouds, how the environment changes when that happens and what effect that will have on that particular color on that particular substrate.
    All of this is born from attempting to arrive at a single, "chosen" color on an established substrate that will later be used in a shoot, taken by a 5d2/70-200, 2.8-L, in a sunny and overcast-ish day by another photographer. Then I will receive that photo (raw), know exactly what color is used in the shoot and I will know, within very good reason, how the light/modified sunlight affects that color on my monitor.
     
  9. Keep in mind the numbers are going to vary, perhaps tremendously, depending on a large number of factors, in order to produce a visual match. That's the problem: the numbers your camera captures and creates, and the numbers that might visually match on a display, may have quite different numeric values. The same color in two color spaces will have different values.
    All of this is born from attempting to arrive at a single, "chosen" color on an established substrate that will later be used in a shoot, taken by a 5d2/70-200, 2.8-L, in a sunny and overcast-ish day by another photographer.​
    That's going to be very difficult, and it will take lots of trial and error just to get color on substrate to match color under one illuminant, let alone two, even if you did have a spectrophotometer (and that device has its own illuminant that comes into play).
    Then I will receive that photo (raw), know exactly what color is used in the shoot and I will know, within very good reason, how the light/modified sunlight affects that color on my monitor.​
    The raw photo doesn't have color per se. It has to be rendered to produce RGB values, and again, the color space plays a role in the numbers produced. Those numbers then go through the computer system to the display and its profile: more and differing numbers.
    Colorimetry, what you'd get from measuring a single color, is about color perception. It is not about color appearance. The reason viewing a print or some other kind of output is more valid than measuring it is that measurement is about comparing solid colors, while color appearance is about evaluating images and color in context, which measurement devices can't provide. Are you only going to deal with a single solid color?
     
  10. Color is not a parameter, it is a perception, something constructed in the mind of the viewer, the subject of philosophers since ancient times.
    Color and color perception is a very complex topic. It stems from the fact that neither the human eye, film nor digital sensors register continuous color, rather a discrete set of three overlapping colors (four for birds and reptiles). Whereas the dyes for film and digital sensors are reasonably evenly distributed at red (650 nm), green (530 nm) and blue (450 nm), the human eye registers blue (450 nm), green (530 nm) and orange (580 nm). That gives us the ability to distinguish many shades of green, probably an evolutionary advantage at some point. When you approximate a smooth spectrum with three distinct colors, you get lumps and dips. Monitors and printers suffer from similar constraints.
    How do you make sure every viewer perceives the same color? Calibrate and standardize.
    Obviously your monitor must be calibrated, as well as the rest of your photographic data stream, including lighting the subject and the editing/viewing environment. Your client must also use a calibrated monitor and environment. Lighting must have a continuous spectrum, which rules out fluorescent and LED sources. Besides sunlight, incandescent lights (including halogen lights, which are simply hotter and whiter) and electronic flash are reasonably continuous in nature.
    Ultimately, you and the client must consult and agree that the results match, or work toward that objective. An example of a critical match will be a company logo, usually composed of dyes or pigments with reflectance spectra unrelated to those in the photographic chain. For most other things a simple white balance will suffice, which is basically a red/blue balance on a white or neutral gray substrate. (With three primary colors, only two are variable and the third dependent.)
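    The red/blue balance described above can be sketched in a few lines. This is a toy von Kries-style channel scaling on made-up linear RGB values, not any raw converter's actual algorithm:

```python
import numpy as np

def white_balance(img, gray_patch):
    """Scale R and B so a known-neutral patch comes out neutral.

    img: float array (H, W, 3), linear RGB.
    gray_patch: (r, g, b) sampled from the neutral target in the same image.
    """
    r, g, b = gray_patch
    gains = np.array([g / r, 1.0, g / b])   # green held fixed; red and blue scaled to it
    return img * gains

# A gray card photographed under warm light comes out reddish:
img = np.full((2, 2, 3), [0.30, 0.22, 0.15])
balanced = white_balance(img, img[0, 0])
print(balanced[0, 0])   # all three channels now equal: the patch is neutral
```

    As the post says, with three primaries only two gains are free once one channel is fixed, which is why a single neutral reference pins down the balance.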
     
  11. Correct Andrew Rodney. I want to show my client what that single color does and how it is perceived under different situations.
    Let's say it is PMS 490 (using PMS because it is a standard baseline of colors in the printing industry). I can take that 490 swatch, shoot it in two lighting situations, then take swatch 491 (for simplicity, let's say that 491 is the warmer direction) and do the same thing, then take swatch 489 (again, let's say 489 is the cooler direction) and do the same thing. We would have 6 images. Each swatch shot in two situations.
    The client could then view these on my display (calibrated) and see, within reason, how each swatch shifts in the two situations. Then the client would have the right information to make an intelligent decision about which final swatch to choose as a target color for their product, to be printed/photographed in the two different lighting situations.
    The goal is to have the client be comfortable with a single hue/color, knowing that it WILL shift in different lighting situations, but having to choose one established hue.
    My thinking is to use a device that helps me record the lighting at the time of the shoot. Then later calibrate LR with a profile for that situation and equipment so that what is viewed is as close as possible to comparing apples (swatch at the shoot) to apples (swatch on my display) so that the client can make an educated decision.
     
  12. The goal is to have the client be comfortable with a single hue/color, knowing that it WILL shift in different lighting situations, but having to choose one established hue.​
    Understood, but difficult to do for many situations, certainly for all.
    The color you show can vary based on the illuminant, of course. One can encounter metameric failure (which some incorrectly call metamerism; metamerism is what produces a match!). The display has to have a sufficient color gamut for the color you wish to show (the example you provide might fall within the sRGB gamut, in which case that's moot).
    It's easy to show colors shifting if that's your goal. Do you remember the old Kodak daylight checkers that illustrated both metamerism and metameric failure depending on whether they were under 'daylight' or not? That would be an ideal way to illustrate these issues to the client in a simple fashion.
    My thinking is to use a device that helps me record the lighting at the time of the shoot.​
    A spectrophotometer (it would actually be a spectroradiometer) could do that, but here's the rub: the values it provides at the scene and the values you get from your raw converter will differ. It is important to understand what Edward just wrote about what color is; it's not a wavelength of light, it's something that happens deep in our brain. Numbers often fail to produce these perceptions, for all kinds of reasons, some of which have been described here.
    Pleasing color and colorimetrically correct color are often vastly different. You're asking for colorimetrically correct color, measuring the light at the scene, etc. Those numbers don't necessarily agree with another set of numbers, say the values a display emits, even when the two appear to match. Or the numbers can match while the color appearance doesn't. It's a very slippery slope.
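    The gamut question raised above is easy to check numerically. A rough sketch using the standard sRGB matrix and ignoring the transfer curve; the XYZ values are illustrative, not measurements of any real swatch:

```python
import numpy as np

# Inverse of the standard linear-sRGB -> XYZ (D65) matrix.
XYZ_TO_SRGB = np.linalg.inv(np.array([[0.4124, 0.3576, 0.1805],
                                      [0.2126, 0.7152, 0.0722],
                                      [0.0193, 0.1192, 0.9505]]))

def in_srgb_gamut(xyz, tol=1e-6):
    """True if the XYZ color maps to linear sRGB values within [0, 1]."""
    rgb = XYZ_TO_SRGB @ np.asarray(xyz, dtype=float)
    return bool(np.all(rgb >= -tol) and np.all(rgb <= 1 + tol))

print(in_srgb_gamut([0.053, 0.039, 0.025]))   # a dark red-brown: True
print(in_srgb_gamut([0.20, 0.60, 0.05]))      # a very saturated green: False
```

    A dark red-brown like the one under discussion lands comfortably inside sRGB, which is why the gamut concern is likely moot here; a saturated green of the same luminance would not.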
     
  13. john_thurston|5 (Freelance Photographer who Prints for others):

    I guess I fail to see the point of this elaborate exercise in color perception.

    What is this client actually buying: the ability to perceive color, the theories behind the fundamentals of colorimetry, or picking a color whose apparent perceived color does or does not change under different lighting conditions?

    The capture of this color under different conditions by a digital device doesn't amount to a hill of beans, since our eyes and brain will make it look exactly the same; in fact, we can never see it under two different lighting conditions simultaneously in anything other than a forced experimental situation anyway.

    Are you trying to convince them of, or refute, the idea that color A is perceived exactly the same to our eyes under various lighting conditions, but not rendered exactly the same under different lighting conditions by digital capture, display and print devices?
    Our eyes do an amazing job at rapidly adjusting to various color temperature light, and perceiving those objects to be the same color under those different conditions.
    On the other hand, as Andrew Rodney has pointed out, digital color capture is not really all that smart, and requires endless tinkering on our part to objectively capture, measure, and display almost insignificant differences that, thank goodness, our eyes tell us are exactly the same.
     
  14. You wanted a "scientific" response, to wit:
    What are your thoughts on the most scientific way to go about achieving this and arriving at a true color as it is affected by the sun and overcast?​
    So you got my response, as a scientist with a Ph.D. whose area of study is (or at least was) sensory physiology, but you didn't like it and dismissed it thusly:
    Wow. There's some bitter folks up in this website. Try the decaf and get out of your house a little more.​
    On this forum, you should endeavor to be a bit less RUDE, especially when someone is trying to answer the question you asked.
     
  15. When viewed on my monitor (assuming it is calibrated correctly for the environment in which it sits), the two swatches can be seen side by side, and the client can view them and understand how the sun, and the lack of sun, in the environment where the swatch physically sat at the moment of the shoot, affects that particular color, used on that particular substrate and printed by that particular printer.
    Basically, the only changing variable from one image to the other image is the effect of the sun behind the clouds, how the environment changes when that happens and what effect that will have on that particular color on that particular substrate.​
    Again, a camera does not record light the same way a human sees with their eyes, which is why you are going to have to emulate the effect of the two different lights on one swatch of color by editing on your calibrated display, so your clients can get the idea.
    I do this all the time and have gotten quite good at it. I demonstrated it on this thread...
    http://www.photo.net/photography-lighting-equipment-techniques-forum/00dNjR
     
  16. Thanks to everyone for the constructive input. Will follow up and let you know how it goes. Cheers
     
  17. I've read quite a few scientific white papers on color science, looking for whether a spectrophotometer can tell me why I see the green-leaf color effect on the right but my digital sensor can't capture it, as shown on the left (see below).
    Not one tech document even touched on this subject or told me whether any scientific instrument could reproduce that effect of sunset light on greens the way I saw it, the version on the right.
    Nor do I find any images online that take notice of this effect in order to reproduce it. Most discussions about making cameras show what we see complain about the software limiting them instead of learning how the software tools work.
    [attached image: 00dOwK-557704784.jpg]
     
  18. As I read what you're saying, I think the ColorChecker would be overly complex and counterproductive. By profiling and calibrating the camera for each of the lighting conditions, you will probably remove much of the subjective difference in color. I would instead shoot in sRGB or Adobe RGB mode on the camera (maybe use the ColorChecker to profile the camera once under controlled, neutral lighting and use that profile for all other situations) and custom white-balance for each scene (for sunlight or sunset you might balance separately for sunlight and skylight). That way your image should reflect the kind of subjective differences the eye might perceive in each condition, rather than just metamerism and other smaller effects.
     
  19. It's even less work than that. You'll only need a few such profiles; they are illuminant-specific. So one for daylight, one for tungsten, one for fluorescent, and then whatever odd illuminant you run into. A single daylight profile can be used all day, every day:
    In this 30 minute video, we’ll look into the creation and use of DNG camera profiles in three raw converters. The video covers:
    What are DNG camera profiles, how do they differ from ICC camera profiles.
    Misconceptions about DNG camera profiles.
    Just when, and why do you need to build custom DNG camera profiles?
    How to build custom DNG camera profiles using the X-rite Passport software.
    The role of various illuminants on camera sensors and DNG camera profiles.
    Dual Illuminant DNG camera profiles.
    Examples of usage of DNG camera profiles in Lightroom, ACR, and Iridient Developer.
    Low Rez (YouTube):
    http://youtu.be/_fikTm8XIt4
    High Rez (download):
    http://www.digitaldog.net/files/DNG%20Camera%20profile%20video.mov
     
  20. It seems to me: start with shooting in raw. Then use a color meter to measure the Kelvin temperature of your light source, the sun on that day, at that time, and the same for an artificial source. Then, in raw conversion, enter the meter's Kelvin temperature into the color temperature setting in the software. Then output a TIFF file in the largest color space possible. That ought to get you close, at least within the accuracy of your camera and lens.
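    For intuition about what entering a Kelvin value does, here is a toy calculation of the channel gains that would neutralize a blackbody illuminant at a given temperature. Planck's law is real physics, but the channel sensitivities below are invented Gaussians, so the gain values are illustrative only:

```python
import numpy as np

H, C, K = 6.626e-34, 2.998e8, 1.381e-23   # Planck, speed of light, Boltzmann

def planck(wl_nm, T):
    """Blackbody spectral radiance at temperature T kelvin (relative units)."""
    wl = wl_nm * 1e-9
    return 1.0 / (wl ** 5 * (np.exp(H * C / (wl * K * T)) - 1.0))

wl = np.arange(400, 710, 10, dtype=float)

def gauss(center, width):
    return np.exp(-0.5 * ((wl - center) / width) ** 2)

# Hypothetical channel sensitivities (NOT real 5D Mark II data).
S = np.stack([gauss(610, 40), gauss(540, 40), gauss(460, 40)])

def wb_gains(T):
    """Channel gains that neutralize a blackbody illuminant at T kelvin."""
    resp = S @ planck(wl, T)    # raw R, G, B for a perfect white reflector
    return resp[1] / resp       # normalize so green stays at 1.0

print(wb_gains(3200).round(2))  # tungsten-ish: red gain < 1, blue gain > 1
print(wb_gains(6500).round(2))
```

    A raw converter's temperature slider does something in this spirit (plus a tint axis and the daylight rather than blackbody locus), which is why a metered Kelvin value gets you close but not necessarily to an exact visual match.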
     
  21. Why dwell on issues of perception? For starters, there is no clear correlation between the color of an object, how it is recorded on film or digital, and how a viewer perceives it. No one can say with certainty that you perceive a color the same way I perceive it.
    Digressing from the philosophical aspect, which I believe underlies the problem, I can say, with certainty, that I have never been able to match the results of two different digital cameras, much less the results of diverse processes like film, scanning and printing of the same subject. For example, in a recent comparison of a digital Hasselblad and a Sony A7 of the same scene, same day, taken a few seconds apart, I could not match all of the major colors. In the end, I did the best I could on foliage, leaving the ostensibly blue sky to vary. On another occasion, I could not match the colors of a 35mm slide scanned in a Nikon 8000 and copied with a digital camera. Theoretically you can match two colors. In practice, be happy with one.
    For what it's worth, I have all the usual tools: a reflective/emissive spectrophotometer (i1), various standard color charts, and software tools including Lightroom, Photoshop, and Premiere Pro (2015) for video, plus a color-managed/calibrated workflow.
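    One way to put a number on "could not match" is a Lab color difference. Below is a minimal ΔE (CIE 1976) sketch; the two XYZ triplets are made-up stand-ins for two cameras' renderings of the same patch, not real measurements:

```python
import numpy as np

def xyz_to_lab(xyz, white=(0.9505, 1.0, 1.089)):
    """CIE XYZ -> CIELAB, D65 reference white."""
    t = np.asarray(xyz, dtype=float) / np.asarray(white)
    f = np.where(t > (6 / 29) ** 3, np.cbrt(t), t / (3 * (6 / 29) ** 2) + 4 / 29)
    L = 116 * f[1] - 16
    return np.array([L, 500 * (f[0] - f[1]), 200 * (f[1] - f[2])])

def delta_e76(xyz1, xyz2):
    """Euclidean distance in CIELAB; ~2.3 is roughly a just-noticeable difference."""
    return float(np.linalg.norm(xyz_to_lab(xyz1) - xyz_to_lab(xyz2)))

# Two captures of the "same" patch from two cameras (made-up numbers):
print(round(delta_e76([0.053, 0.039, 0.025], [0.056, 0.039, 0.023]), 1))
# a visible mismatch, several JNDs, even though the XYZ values look close
```

    Later ΔE formulas (ΔE94, ΔE2000) weight the axes more perceptually, but even the simple 1976 distance makes the point: small XYZ differences between two cameras can already be clearly visible.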
     
  22. Interesting. I once did a similar thing with an analog camera and slide film, and I didn't worry about color rendering at the time (I just had to choose an appropriate film). A good example of how things can get more complicated when digitized: many more variables today.
     
