
EyeOne Display calibration of Apple monitors



I found this Q&A on EyeOne's web site. Has anyone tried this? What do you think?

Does it apply to Cinema Displays?

 

Question

When I calibrate my Apple display to a warmer white point (e.g. D50), it becomes somewhat too dark and slightly yellowish. Is it possible to get a better result?

Answer

Reason: In general, Apple displays do not allow you to adjust the white point in hardware (via the on-screen display buttons on the monitor) to a desired value, e.g. 5000K. So this needs to be done in software, by a gamma curve correction, or lookup table (LUT), that is downloaded to the graphics card. However, when you make this correction, luminance is lost, which causes the muddy, yellowish result.

 

The workaround:
1. Set the Brightness on the monitor to 100%.
2. Start your calibration software (Eye-One Match or PM5).
3. Define the target White Point (e.g. 5000K), the desired Gamma and, very important, a target Luminance value, e.g. 120 cd/m2.
4. Follow the instructions of the calibration process, but bypass or skip the Brightness, Contrast, and RGB Adjustment. Adjust the Luminance using the Brightness button on the monitor!
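To see why a pure video-card correction costs brightness, here is a minimal sketch; the channel gains below are hypothetical, not measured for any real display. Warming the white point means scaling the green and blue channels down in the LUT, so the maximum white luminance drops accordingly.

```python
# Sketch of why a video-card LUT white-point shift costs luminance.
# The gains below are hypothetical examples, not measured values.
REC709_Y = (0.2126, 0.7152, 0.0722)  # Rec. 709 luminance weights for R, G, B
gains = (1.00, 0.93, 0.78)           # scale G and B down to warm the white point
white_luminance = sum(w * g for w, g in zip(REC709_Y, gains))
print(f"white after correction: {white_luminance:.1%} of native luminance")
```

With these example gains the display can only reach about 93% of its native white luminance, which is the dimming the Q&A describes.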

Link to comment
Share on other sites

But your monitor isn't your slide table and your scan isn't your slide. I think you'll be happier with 6500K, a 2.2 gamma and the luminance value for your type of monitor, but then what do I know? My prints match what I see on my monitor (within 95% on the first print) and also match what clients see on their calibrated and profiled monitors.


Sorry for not being clear enough. My question has nothing to do with setting the monitor to 5000K. I am just wondering why I have to do step 1 above before starting the calibration process. The recommended process applies whether I'm setting the monitor to 5000K or 6500K.

 

As far as setting a monitor to 5000K instead of 6500K, it has to do with matching my monitor to my slides viewed from a 5000K light table. That allows me to match my scanned film image to the original slide. If you are not trying to match your monitor image to a 5000K source, setting the monitor to 6500K is OK.


Michael, I use a Cinema Display and an Eye One calibrator. (I calibrate to a white point of 5500K.) I don't really understand the Q&A you have quoted. First they say set Brightness to 100%, then they say don't adjust the Brightness, then they say adjust the Brightness. Huh?


Elliot, I think what they mean is: before starting your EyeOne calibration software, turn your monitor brightness to maximum. When you're running the EyeOne advanced calibration, skip the adjustment steps (just click NEXT) when it prompts you for brightness, contrast and RGB adjustments. When you get to the luminance adjustment step, use the monitor brightness buttons to bring the monitor within the luminance green zone (around 120 cd/m2).

Michael,

 

I hear very contradictory opinions on calibrating Cinema displays to 5000K. Some say "no problem," others say it can't be done. The lack of adjustments to get the monitor as close as possible before generating a profile (which is really the calibration part of the "calibration" process, the other part being profile generation) certainly doesn't help. It forces major corrections through the video card, resulting in a significant reduction of the remaining levels, which can cause posterization, particularly in blue sky and similar areas.
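The level loss described here can be illustrated with a toy 8-bit LUT; the gain and gamma tweak are made-up values chosen only for illustration. Once a correction is baked into 256 8-bit entries, many inputs collapse onto the same output level.

```python
# Toy example: an 8-bit video-card LUT correction collapses distinct levels.
# The channel gain and gamma tweak are hypothetical, for illustration only.
gain, inv_gamma = 0.82, 1.0 / 1.1
lut = [round(255 * (gain * v / 255) ** inv_gamma) for v in range(256)]
print(f"distinct output levels: {len(set(lut))} of 256")
```

The stronger the correction the video card has to apply, the fewer distinct levels survive, which is where the banding and posterization come from.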

 

On the subject of matching your monitor's color temperature to the lighting, I agree with you 100%. If I use what most people swear by (monitor at 6500K, lights at 5000K), I see a horrendous color mismatch, with the monitor way too bluish versus the print illuminated by the 5000K lighting. Even professionals like Bruce Fraser - may he rest in peace - kind of admitted the problem exists and advised intentionally moving the print viewing area away from the computer so you couldn't compare the two! Whenever I speak out in favor of color temperature matching, people wonder how I can be so misguided. Can't they see the mismatch that I, and apparently you, see?


Frans, I agree with you on this one: the color temperature of the screen should preferably match your viewing light's color temperature. So you can do two things: match your screen to the lamps you have, or match your lamps to the screen. For the latter you could use conversion filters, like Lee or Rosco gels. As for your other remarks: how do you think an LCD can be preset to get closer to your target settings? Most LCDs cannot be adjusted in any way other than by changing the subtraction level of the RGB pixels. So you might as well do the entire targeting through the video card. There is no need for double "profiling" of your screen. With Apple screens you do have a hardware setting for the lamp brightness, though (by means of an electronic dimmer), so that should be set manually. The theoretically ideal screen would have a light source that can be color- and brightness-adjusted, apart from the LCD panel.


Erik,

 

The better LCD monitors let you - in addition to adjusting the backlight brightness - set the red, green and blue levels independently of the video card, thus preserving the full 8 bits per channel for profiling and normal operation. The even better LCD monitors take the video card totally out of the loop and do the calibration and profiling within the monitor, leaving all 8 bits per channel available for normal operation. This approach is used, for instance, in the NEC 90 Series monitors with the DDC/CI interface.


Frans,

 

Did you ever see one of these LCDs in reality? Or do you even own a profiling device yourself? As long as a screen uses its pixels to adjust the basic RGB settings, there is NOTHING to gain in setting those values. Adjusting the RGB channels separately from the video card actually limits the range of pixel brightness levels available to that card. There can only be some gain where the screen's internal video circuitry has more bits for its image rendering than the video card in the computer. This allows for higher color precision, even though the latitude of the pixels does not improve. A real plus would be if a screen had independently adjustable light sources for the separate RGB channels, so that the native white could actually be set. With modern LED technology this is about feasible. It would give the kind of control that one finds in a CRT, where basic RGB can be set separately because each color has its own electron gun. And then, yes, THEN you would have all bits available for image rendering over the full range of levels each pixel can be set to.


Erik,

 

I really could do with a little less of a hostile attitude. I've been using calibrators for 7 years, if that answers your kind question. Calibrating a monitor by its very nature limits its range, as does subsequent profiling. However, where those adjustments occur has an impact on the levels available through the video card. If all adjustments are made through the video card, then fewer than 8 bits remain for normal operation of the monitor. If calibration is done in the monitor, then more levels remain for normal operation, and if the profiling is also done in the monitor, then ALL 8 bits are available for normal operation. Of course the monitor LUT needs to be bigger than 8 bits, otherwise there would be no improvement. The better monitors with internal calibration/profiling use 10 bits; the best, like the NEC models I referred to, use 12 bits.
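A rough sketch of that difference, with made-up correction values: quantizing a correction to an 8-bit video-card LUT merges levels, while a 12-bit in-monitor LUT has enough steps that all 256 incoming video levels stay distinct.

```python
# Compare an 8-bit video-card LUT with a 12-bit in-monitor LUT applying the
# same (hypothetical) correction: a channel gain plus a gamma tweak.
gain, inv_gamma = 0.82, 1.0 / 1.1

def corrected_levels(steps):
    # quantize the corrected curve to a LUT with `steps` entries,
    # then count how many distinct outputs the 256 inputs map to
    return {round((steps - 1) * (gain * v / 255) ** inv_gamma) for v in range(256)}

print(len(corrected_levels(256)), len(corrected_levels(4096)))
```

With 4096 steps every one of the 256 input levels lands on its own output entry, which is the "all 8 bits remain available" point above.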


I need some education here. Are you saying the NEC 90 series LCD monitors (e.g. the LCD2490WUXi) are better than the Cinema Display? If so, in what way? The specs such as resolution, cd/m2 and contrast are all about the same. If I need to use an EyeOne calibrator to calibrate the monitor, would one monitor still outperform the other in the areas of color accuracy and smoothness? The other thing is: what's the significance of the 12-bit LUT and of having x bits left for certain operations?

 

I'm debating whether I want to upgrade my Cinema Display. Thanks.


Michael,

 

Yes, I believe the NEC 90 Series IPS monitors (identified by the letter i in their product number) are superior. The IPS technology provides the best possible color accuracy both straight-on and off-axis; the Apple Cinema Displays are not IPS technology. These NEC monitors use the DDC/CI interface and in-monitor 12-bit correction, so no levels are lost to calibration and profiling, reducing the possibility of posterization in areas like blue sky. The Cinema displays don't have this. Also, the NEC LCD2690WUXi covers 93% of the Adobe RGB (1998) color space because it uses a wide-gamut CCFL backlighting system. This is even better than most CRT monitors. Most other LCD monitors - with the exception of some models costing over $4000 - cover about 70% of the Adobe RGB (1998) color space. In addition, some people report that Cinema displays cannot be calibrated satisfactorily down to 5000K or so, while others report no problems in this area.

 

For color-critical work in a reasonably well thought-out work area where outside light is well controlled, 200 cd/m^2 is more than enough brightness; you will probably need to run an LCD monitor at far less than that, depending on the brightness of your digital darkroom lighting. As long as your contrast ratio is at least 600:1, you should be fine. It would be very hard to actually see the difference between such a value and one that is significantly higher.
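For the arithmetic behind that: the black level implied by a white-point luminance target and a contrast ratio is simply their quotient.

```python
# Black level implied by a 120 cd/m^2 white point and a 600:1 contrast ratio.
white = 120.0          # cd/m^2, a common calibration target
contrast_ratio = 600.0
black = white / contrast_ratio
print(f"black level: {black:.2f} cd/m^2")  # 0.20 cd/m^2
```

Doubling the contrast ratio only moves the black from 0.20 to 0.10 cd/m^2, a difference that is hard to see under normal room lighting.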


They are S-IPS displays and considered among the best in the business for predictable color output to commercial presses. They are SWOP certified. I'd leave it at native, because adaptation will kick in before you can notice color shifts between what appears on the light table and on the display.

That kind of precision really isn't worth the hassle and the reduction to display gamut, which is already at sRGB specs at the native WP - the best anyone can get from a display.

 

As for doing step one (increasing brightness to 100%): it might have something to do with the clearing of the gamma LUTs during the section within i1 Match where you see the screen change in gamma and color cast. This happens in all display calibration packages.

 

It has to, because there's always a gamma-based profile loaded in the graphics card. Most people can't tell when it happens, even when they don't calibrate, but it sometimes can be seen when they first start up the computer and notice the screen darken and lighten.

 

Maxing out the brightness as the first step probably has more to do with getting the eyes used to the shift in gamma when the LUTs clear. Just my guess.


Just my two cents...

I find that many people take all the technical stuff out there way too far. Of course it can be nice to know all this for your personal knowledge, I agree. BUT why don't you, in the end, just buy the best LCD your money can buy, with a good calibration device, and do some good retouching on your calibrated screen? Do you really need to know all the LUT, cd/m2, candela and all that stuff to produce good images? I am a professional photo retoucher who loves tech specs once in a while, but I prefer to put my energy into my retouching skills. That said, if you read around this forum or the internet, many agree that you should not calibrate your screen to D50 or 5000K; 6500K should be closer. I calibrate my Apple Cinema Display at 6500K and 2.2 gamma. The result? I have never been disappointed when I print on my 4800, and the results I see when it prints in the magazine are pretty similar. In the end, I think it is the result that matters. To all of you... just my personal opinion : )


Tim, I would like to follow up on your following comment:

 

"That kind of precision really isn't worth the hassle and reduction to display gamut which is already at sRGB specs at the native WP-the best anyone can get from a display."

 

My understanding is that the sRGB space is based on the display capability of a common/average monitor, not the best display available. Monitor display capability varies a great deal; for example, NEC advertises their 90 series monitors as capable of reproducing roughly 60% to 90% of the Adobe RGB color space, depending on the model. If you have an average or below-average monitor, won't you have problems soft proofing your images? When all your images are manipulated in the Adobe RGB space and you want to see how they will print on certain printers, won't you have a problem differentiating the different proofs if your monitor is the weakest link in the chain?

 

Another thing: why does setting the color temperature to 5000K reduce gamut? My monitor is set to 5000K and 2.2 gamma. I thought those were independent parameters?


Michael,

 

I'm basing my statements on a Karl Lang article about displays that claim to be close to Adobe RGB. Karl Lang was the designer/engineer behind some of the best CRTs for commercial press usage, like the Radius PressView, on which the ColorMatch RGB color space is based. Do a search for this article. It states that even though LCDs can claim 60-90% of the Adobe RGB gamut based on examination of gamut plots, a computer's 8-bit video system can't sufficiently support it. The math behind it bears this out.

 

And I use sRGB as a general term to describe most displays, like my CRT, which shows XYZ and gamma build numbers very close to the sRGB space as examined in Photoshop's Color Settings > Custom RGB after loading my EyeOne Display profile as the working space. It doesn't say which colors the display is capable of rendering or how it will render them. It's just a color model roughly describing shape and size.

 

Examine the three corners of the triangle-shaped gamut plot for your calibrated display in i1Match when targeted for 5000K and for native. When I start pushing the white point color tint, brightness and contrast out of the 6500K, 2.2 gamma ranges, the corners of the gamut plot for my CRT move inward slightly, mainly in the green.

 

The gamut of any display is determined by how bright and contrasty (i.e., how wide a dynamic range) you can calibrate and push the display while still maintaining RGBCMY purity and intensity, with no banding or hue shifts in gradients made up of these purities. CRTs have phosphor pigments to make up their purities; LCDs don't have phosphors. They have to emulate a CRT's response by increasing the backlight intensity shining through colored filters, much like stained glass in a cathedral.

 

Another thing is that 5000K and 6500K aren't exact measurements of color temperature. On my display I've made 6500K go from pinkish blue to greenish blue, and 5000K from orangish yellow to cream, and the EyeOne still measured them as 6500K and 5000K respectively. I have four 5000K Sunshine flotube viewing lights, two of which lean slightly toward a subtle maroon cast and the other two toward dull green, but for me to see this they need to be right next to each other.
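The point that one correlated color temperature covers many visibly different tints can be checked numerically. McCamy's well-known approximation maps a CIE xy chromaticity to a CCT; the two chromaticities below (chosen for illustration, not measured from any display) look quite different yet report nearly the same temperature.

```python
# McCamy's approximation: correlated color temperature from CIE xy chromaticity.
def mccamy_cct(x, y):
    n = (x - 0.3320) / (0.1858 - y)
    return 449 * n**3 + 3525 * n**2 + 6823.3 * n + 5520.33

d50ish = mccamy_cct(0.3457, 0.3585)   # chromaticity near the D50 white point
shifted = mccamy_cct(0.3434, 0.3300)  # a visibly different tint, same isotherm
print(round(d50ish), round(shifted))  # both come out near 5000 K
```

So a meter reading "5000K" pins down only one axis of the white point; the green-magenta tint can still wander, which is exactly what the flotube comparison shows.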

 

What color tint appearance and brightness level is your light table set to? Those two go hand in hand in determining how fast your eyes adapt to differences in color temperature between viewing devices and environments. Are you sure your light table is 5000K and the same brightness as your display?

 

My CRT at 6500K, after adaptation, looks more neutral next to my 5000K flotubes, and that's what I need to EDIT in. I just make a mental note that my prints, on first glance under these lights, are going to have a slight yellowish cream cast, but adaptation soon fixes that, because the brightness level of the light reflected from my prints matches the brightness level of my CRT.

 

And any display can pretty much cover the gamut of an output device like a printer, which uses inks, while a display uses light and intense color purities. Soft proofing to these devices shouldn't be a problem.


I think you really have to "calibrate" your Apple display to Native/Native. That means you just measure it, and that's it. End of story. Everything else has a negative effect on the output quality.

 

Obviously, set it to a reasonable brightness (luminance) of 120 cd/m2, plus or minus 20 depending on your lighting conditions.

