
Why can't I capture the original color of the item?


mayli


What are you photographing? Are you setting white balance to fit the lighting conditions? Are you editing your images, and with what software? More specific information would be helpful in answering your question.

The problem: we have color theory, but all the filters we put over our sensors are a technical compromise, not perfect. The inks used to print images are even worse.

If you are unlucky, you even have far-from-ideal software sitting between your image file and your monitor.

There is also no perfect, technically ideal film as an alternative; at best there is a best-suited lesser evil to choose, with at least as many pitfalls awaiting you in a film workflow.

Things like auto white balance tend to do a great job but can be fooled too. Take a white sheet of paper and photograph a box of crayons, one crayon after another, placed on that sheet with AWB on. When you put the pictures of the differently colored subjects, all shot under the same light, side by side, you'll most likely see a color shift in the paper.
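
For the curious, here is a minimal sketch of the gray-world assumption behind simple auto white balance. It is a toy illustration, not any camera's actual algorithm, and the image values and the helper name gray_world_awb are invented:

```python
import numpy as np

# Toy gray-world auto white balance: scale each channel so the image
# average comes out neutral. A large coloured subject skews the
# average, so the "white" paper shifts the opposite way.
def gray_world_awb(img):
    means = img.reshape(-1, 3).mean(axis=0)   # per-channel averages
    gains = means.mean() / means              # push the average toward grey
    return np.clip(img * gains, 0.0, 1.0)

img = np.full((100, 100, 3), 0.9)             # white paper
img[30:70, 30:70] = [0.8, 0.1, 0.1]           # red crayon on top
balanced = gray_world_awb(img)
print("paper before:", img[0, 0], "after:", balanced[0, 0])
```

The red crayon drags the frame average away from neutral, so the correction pushes the white paper toward cyan, exactly the shift described above.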

 

Try to shoot RAW. Use something like an X-Rite ColorChecker and its related software to profile your camera. Get out of AWB...
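
To give a flavour of what chart-based profiling software does under the hood, here is a toy least-squares fit of a 3x3 colour correction matrix; the patch values and the pretend "true" matrix are invented for illustration, and real profiling tools fit more sophisticated models:

```python
import numpy as np

# Toy version of chart-based profiling: fit a 3x3 matrix that maps the
# camera's RGB for each chart patch onto the chart's known reference
# values, by least squares. All values here are invented.
camera_rgb = np.random.rand(24, 3)                     # 24 measured patches
truth = np.array([[ 1.6, -0.4, -0.2],
                  [-0.3,  1.5, -0.2],
                  [-0.1, -0.4,  1.5]])                 # pretend "real" response
reference = camera_rgb @ truth.T                       # chart reference values
M, *_ = np.linalg.lstsq(camera_rgb, reference, rcond=None)
corrected = camera_rgb @ M                             # apply the profile
print("max error:", np.abs(corrected - reference).max())  # ~0 on toy data
```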

But the problem I described in my first line will remain; it will just become less boldly visible.

 

IMHO photography has decreed: "Somebody will end up with grey hair." That could either be me, trying to get my subjects' colors right, or my subjects, when I approach them in B&W.


Sadly, modern digital cameras are designed with very narrow-cut (small bandwidth) red, green, and blue separation filters built into the sensor. This is great for saturation (colour intensity) but not so good when it comes to accurate representation of spectral colours, such as a rainbow, the colours from a glass prism, or those reflected from certain flowers or man-made dyes.

 

The cult of tri-colour theory has been taken to extreme, and while it works up to a point, it's not perfect. At least not in the form of the 'brick wall' RGB filters currently being used in camera Bayer matrix sensors.
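
A toy numerical illustration of that point, with invented Gaussian curves standing in for real filter sensitivities (the peaks and widths here are made up, not measurements of any sensor):

```python
import numpy as np

# Toy model: narrow Gaussians stand in for the camera's R, G, B
# filter sensitivities (centres and widths invented for illustration).
wl = np.arange(380.0, 701.0)                 # visible wavelengths, nm

def gaussian(center, width):
    return np.exp(-0.5 * ((wl - center) / width) ** 2)

sens = {"R": gaussian(600, 15), "G": gaussian(540, 15), "B": gaussian(460, 15)}

def record(spectrum):
    return {c: round(float((spectrum * s).sum()), 3) for c, s in sens.items()}

cyan_line = gaussian(490, 2)                 # near-monochromatic prism cyan
print(record(cyan_line))                     # R and G barely respond
```

In this toy model, a spectral cyan at 490 nm falls between the B and G peaks, so it records as a dim blue rather than the vivid blue-green the eye sees.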

 

Having said that: there's no reason why you shouldn't be able to achieve a reasonable representation of, say, 90% of real-world pigments and dyes, provided the illuminating light source has a continuous and balanced spectrum, the white balance of the camera is set properly, and everything in the reproduction chain is properly calibrated and of a good standard.

 

Shooting RAW will not help if, for example, CFL lamps with a poor output spectrum are used to illuminate the subject, or a badly calibrated or low-quality monitor is used to view the images.


Yes.

See all of the various posts above. I don't have anything additional to add, except to say there's no camera/software/monitor combination that exactly replicates the human eye/brain combination. So, you have to work with the tools available to obtain the results you want. There's simply no exact, 1:1 correspondence.


What you desire is a faithful image, i.e. one that looks exactly the same as what you observed when the shutter was clicked. Photo engineers have strived to do exactly that for about 200 years. To date this has never been achieved.

 

Consider: on a typical sunny day, the difference between the darkest and brightest object exceeds 2000:1. In other words, if we measure a black automobile tire in shadow and the gleaming chrome trim, the highlights off the chrome are over 2000 times brighter. Should we image this scene, the best we can do on film or a digital image is about 256:1. If we could accurately duplicate this vista on a theater or computer screen, you would want to don sunglasses for comfort.
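
In stop terms, with each stop doubling the brightness, those ratios work out like this:

```python
from math import log2

# Each f-stop doubles the brightness, so a ratio of N:1 spans log2(N) stops.
for ratio in (2000, 256):
    print(f"{ratio}:1 is about {log2(ratio):.1f} stops")
# 2000:1 is about 11 stops of subject range; 256:1 is exactly 8 stops.
```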

 

As to color matching: we are forced to make three separate records of the subject, one red, one green, and one blue. Using a film camera, we can make three discrete exposures on three dissimilar films, or use a film consisting of three specialized layers. Likewise, a digital camera makes three juxtaposed records, one red, one green, and one blue.
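
Here is a crude sketch of the Bayer idea behind the digital case. The "demosaic" below just pools each 2x2 tile into one RGB pixel, whereas real RAW converters interpolate all three colours at every photosite; the helper names are invented:

```python
import numpy as np

# Sketch: an RGGB Bayer sensor records one colour per photosite.
def bayer_mosaic(rgb):
    h, w, _ = rgb.shape
    raw = np.empty((h, w))
    raw[0::2, 0::2] = rgb[0::2, 0::2, 0]     # red sites
    raw[0::2, 1::2] = rgb[0::2, 1::2, 1]     # green sites
    raw[1::2, 0::2] = rgb[1::2, 0::2, 1]     # green sites
    raw[1::2, 1::2] = rgb[1::2, 1::2, 2]     # blue sites
    return raw

def crude_demosaic(raw):
    r = raw[0::2, 0::2]
    g = (raw[0::2, 1::2] + raw[1::2, 0::2]) / 2
    b = raw[1::2, 1::2]
    return np.stack([r, g, b], axis=-1)

scene = np.random.rand(4, 4, 3)              # stand-in for a real scene
print(crude_demosaic(bayer_mosaic(scene)).shape)   # (2, 2, 3)
```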

 

What I am trying to tell you is: the engineering feat needed to make a faithful image is extreme, and we have not yet met this goal. What we have achieved is remarkable. Perhaps in the future we will come closer; perhaps it will be you who figures out how to make a faithful image.


The cult of tri-colour theory has been taken to extreme, and while it works up to a point, it's not perfect. At least not in the form of the 'brick wall' RGB filters currently being used in camera Bayer matrix sensors.

 

Not too terribly long ago, I got to see a demonstration of a very early Phase One digital back mounted to a Sinar 4x5 view camera (although it could also have been used on a handful of other cameras with the right adapter). This particular camera had been in the marketing department of a large local company, although I have no idea when they quit using it. It probably photographed a whole lot of bourbon bottles...

 

In any case, this particular back does not use a Bayer array. Rather, it makes a color image by taking three separate exposures through filters that are shifted in front of the sensor. Even discounting the benefit of nearly 25 years of sensor development (I've used Bayer Kodaks that were not much newer), the color rendition is certainly different from what I'm accustomed to with a modern Bayer sensor. It was more like shooting Kodak 160NC or a similar film, as opposed to the bright, crisp colors my modern Nikons give. I can see why it might have stuck around in the marketing department past its prime, even if, on paper, a modern camera is in every way better.

 

Along those same lines, a film scan certainly looks different from a digital original. I'm not saying that as a good or bad thing; film has its own failings in terms of accurate color reproduction, and back in the day we would select the appropriate film stock for what we were photographing, taking the good with the bad. I wouldn't photograph a vivid landscape scene with 160NC (or even the modern Portra 160), and I wouldn't photograph a wedding with Velvia. At least with something like my D800 I can tweak that from shot to shot (or even after the fact in RAW), but if my light is bad there's only so much I can do.


When shooting, I often can't capture the original color of the item. What is the problem, and how do I solve it?

 

What is the light source when you have problems? The sun? The shade under a tree? Are you indoors or shooting at night with all artificial light?


I'm surprised that no one has mentioned the role of human perception. When we look at, say, a green book in different kinds of light (under sunshine, cloudy skies, old-style incandescent lights, fluorescents, etc.), it looks like the same green book. We notice the light is different, but our brains adjust and, to an extent, "see" the same color green. Under many electric lights that work with alternating current, the color actually changes 50 or 60 times a second, but we don't notice. A camera captures the color at one instant and has a more limited ability to adjust to the light source. Digital cameras have automatic white balance that attempts to adjust for the ambient light, but it doesn't work perfectly, and most allow the photographer to make that adjustment manually.

 

When we look at a photograph, on a screen or printed, our brains adjust to the light of the room we are in, but not to the light in which the photograph was captured. In a sense, our brains play a trick on us, but it's a good trick. I know that if I noticed the light in a room changing 50 or 60 times a second, it would drive me crazy.


David commented on perception when he mentioned the "eye/brain combination," but perception isn't really the issue here, since the best we can do, when done well, is provide the perception that the colors match.

 

"Under many electric lights that work with alternating current, the color actually changes 50 or 60 times a second, but we don't notice. A camera captures the color at one instant and has a more limited ability to adjust to the light source.:

 

I don't believe that this is true at all... If it were, photos taken under such lights, fractions of a second apart, would appear to be different colors. Can you give an example where this occurs?


Under many electric lights that work with alternating current, the color actually changes 50 or 60 times a second

Like Ed, I'm not sure this is entirely accurate. Incandescent filaments do not respond to the AC cycle enough to create a meaningful or measurable color difference, because of thermal lag. The same is true, to a lesser degree, of fluorescents, though, like LEDs, they can exhibit very narrow spectra. LEDs, particularly those that dim, can produce an interference pattern between their on/off cycle and the exposure time of a camera, but they do not tend to change color temperature substantially across these cycles. (Dimmable LEDs, at least the large commercial variety, don't actually change their brightness; they cycle on and off at variable rates that have the same visual effect as changing the light output.) There are also HID-type fixtures that experience on/off cycling, which might affect their color temperature, but, as with incandescents, the thermal lag built into the emitter mostly smooths out the effect.

 

By the by: I've seen some threads where pro photographers were frustrated taking photos of weddings and other events under the new-fangled LED fixtures. They had to select shutter speeds carefully so as to overlap several on/off cycles, with the concomitant penalty of being forced into slower shutter speeds than desired. The other major photo problem with LEDs is their potentially very narrow color band. Our eyes are pretty good at adjusting to the available light, but digital cameras far less so. Unless the LEDs (or fluorescents, for that matter) have a high Color Rendering Index (CRI), there may not be enough of the spectrum available for a digital sensor to capture all the apparent colors.
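
As a back-of-the-envelope sketch of the shutter-speed constraint (the PWM frequency here is an assumed example, not any real fixture's spec):

```python
# To average across an LED fixture's PWM dimming, the exposure should
# span several on/off cycles.
pwm_hz = 1000                                # hypothetical dimming frequency
cycles_to_average = 10                       # smooth the flicker over ~10 cycles
min_shutter = cycles_to_average / pwm_hz     # seconds
print(f"shutter speed of 1/{int(1 / min_shutter)} s or slower")  # 1/100 s
```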


The same, to a lesser degree, for fluorescent,

A slight correction: fluorescents don't experience much in the way of color temperature change, but they do exhibit significant flicker due to the AC current. This is why many cameras include an "anti-banding" function that times the shutter to avoid the banding that appears when there is a harmonic correspondence between the shutter movement and the light's flicker. (That's my best attempt at putting this in layman's terms.)
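
A sketch of the arithmetic those anti-banding settings rest on, assuming the usual mains frequencies:

```python
# Fluorescent flicker peaks at twice the mains frequency: 100 Hz on
# 50 Hz mains, 120 Hz on 60 Hz. Exposures lasting a whole number of
# flicker cycles average the banding away.
for mains_hz in (50, 60):
    period = 1 / (2 * mains_hz)              # one flicker cycle, seconds
    safe = [f"1/{round(1 / (n * period))}" for n in (1, 2, 4)]
    print(f"{mains_hz} Hz mains -> safe shutter speeds: {', '.join(safe)} s")
```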


You can get close by balancing color carefully and using a light source with a good spectral distribution (the sun or an incandescent lamp, not fluorescent, sodium vapor, or mixed light). Have your monitor calibrated. If printing, calibrate and profile your printer. Even then the result will be close but never perfect.

Should we image this scene, the best we can do on film or digital image is about 256:1.

Not strictly true. While 8-bit JPEG images can only represent 255 different levels of greyscale brightness (with zero disallowed for mathematical reasons), those discrete levels don't dictate the contrast or brightness range of the displayed image.

 

We might output the same image as a print (limited to about a 100:1 brightness range) or on a high-quality monitor that could achieve a 1000:1 brightness range when viewed in dim ambient light. (Maybe even more in a totally unlit, black-painted room or booth, although makers' claims of thousands-to-one screen contrast are grossly exaggerated!)

 

OTOH, the print's highlights will reflect close to 100,000 lux if viewed in open sunlight, while the more contrasty monitor's highlights will output only a few hundred. So brightness, contrast, and the number of levels displayed between 'black' and 'white' have almost nothing to do with one another. It's all relative; and the eye is very good at accommodating a wide range of ambient brightness and colour temperature.
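
A small sketch of that independence; the black- and white-point luminances (in cd/m^2) are assumed examples chosen to match the 100:1 and 1000:1 figures above:

```python
# The same 8-bit code values land on very different absolute luminances
# depending on the medium; level count and contrast range are separate.
def luminance(code, black, white, gamma=2.2):
    return black + (white - black) * (code / 255) ** gamma

for name, black, white in [("print under gallery light", 0.8, 80.0),
                           ("monitor in a dim room", 0.12, 120.0)]:
    contrast = luminance(255, black, white) / luminance(0, black, white)
    print(f"{name}: contrast about {round(contrast)}:1")
```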

 

Mayli seems to be a person of few words. A little more explanation of where her system is failing her wouldn't come amiss.


Not strictly true. While 8-bit JPEG images can only represent 255 different levels of greyscale brightness (with zero disallowed for mathematical reasons), those discrete levels don't dictate the contrast or brightness range of the displayed image.

 


I am referring to the range of f-stops that can be imaged. Each f-stop is a 2x incremental step; thus 256:1 is an 8 f-stop range. Some methods can capture a 10 f-stop range, but most often the range is about 8 f-stops.


I am referring to the range of f-stops that can be imaged. Each f-stop is a 2x incremental step; thus 256:1 is an 8 f-stop range. Some methods can capture a 10 f-stop range, but most often the range is about 8 f-stops.

 

That 256:1 figure would be true of an 8-bit linear file, but the RAW files of most current cameras are 14-bit. And when RAW data is converted to an 8-bit image file, the conversion isn't done in linear fashion; it uses a gamma of about 2.2, which increases the range that can be represented. However, the gradation in the shadows is poorer, and there is better gradation in the highlights.
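
A sketch of that effect; the sample values are arbitrary 14-bit linear levels, not data from any particular camera:

```python
import numpy as np

# Gamma-encoding linear sensor data before the drop to 8 bits keeps
# deep shadow levels distinct, where a linear 8-bit encoding would
# crush them to zero (although, per stop, the highlights still get
# far more codes than the shadows).
linear = np.array([1, 2, 4, 8, 16, 4096, 8192, 16383]) / 16383.0
gamma_8bit = np.round(255 * linear ** (1 / 2.2)).astype(int)
linear_8bit = np.round(255 * linear).astype(int)
print("gamma 2.2:", gamma_8bit)    # deep shadow levels stay distinct
print("linear   :", linear_8bit)   # deep shadow levels collapse to 0
```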


Because pictures are digital (assuming that's your case), they don't really exist as fixed objects; they are rendered anew each time we view them. The color of the image therefore depends on the screen used for viewing, and this can vary widely. Professional photographers, editors, and publishers go to great effort to *calibrate* their viewing devices to match the desired final output for the images (print, web, etc.). This is a process that requires certain tools and procedures to be done properly.

I am referring to the range of f-stops that can be imaged. Each f-stop is a 2x incremental step; thus 256:1 is an 8 f-stop range. Some methods can capture a 10 f-stop range, but most often the range is about 8 f-stops.

That 7 or 8 stop SBR may have held good for slide film, but colour negative film could easily encompass a 10 stop SBR, while a modern digital camera can manage 12 stops without too much trouble, lens and body flare being the limiting factor rather than the sensor.

 

However, the original subject brightness range is almost completely disconnected from the contrast (density or brightness range) of the slide, negative, screen, print, or whatever other output medium is chosen. For example, slide film paradoxically has a wider density range (11 to 12 stops) than the SBR it can handle. OTOH, negative film can squeeze 10 stops of SBR into only a 4 stop density range. And, as BeBu points out, the 14 digital bits of a DSLR or MILC can be almost directly converted to a 14 stop capture range.

 

None of this expansion or contraction of contrast really affects the perceived output colour, which depends more on the relative amounts of red, green and blue stimuli presented to the eye.

