
You are parsing words. Calibrating and adjusting a monitor consists of two parts. If possible, the monitor brightness and contrast are adjusted using a standard (e.g., a photometer). The same device is then used to read color patches (another standard) to produce a profile which conforms the output of the screen to the expected color of the patches. The profile is a digital "adjustment", much as you would construct a chart to standardize a spectrophotometer or chromatograph in the laboratory. In all cases, traceable standards are used to produce the points on those curves.
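For readers following along, here is a minimal sketch (in Python, with invented numbers) of the idea being described: compare measured grey-ramp patch readings against a chosen tone response and derive a correction curve. This is not any particular calibration package's actual method, just an illustration of "points on those curves."

```python
# Minimal sketch: derive a correction curve from measured grey-ramp patches.
# All readings below are made up for illustration.
import numpy as np

# Drive levels (normalized 0..1) requested from the display for a grey ramp.
requested = np.linspace(0.0, 1.0, 9)

# Hypothetical instrument readings for those levels, normalized to white.
measured = np.array([0.000, 0.018, 0.060, 0.130, 0.225, 0.350, 0.510, 0.720, 1.000])

# Chosen aim for this example: a simple 2.2 gamma tone response.
target = requested ** 2.2

# For each requested level, find the drive value whose measured output
# lands on the target response (simple inverse interpolation).
correction = np.interp(target, measured, requested)

for r, c in zip(requested, correction):
    print(f"request {r:.3f} -> send {c:.3f} to approximate the 2.2 gamma aim")
```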

 

Not surprisingly, I have met the same resistance to good measurement practice from lab technicians and a few PhDs. In the pharmaceutical industry, NIST definitions and practices are not optional.


You are parsing words.

No, I'm using the correct technical language. And correcting misinformation along the way.

Calibrating and adjusting a monitor consists of two parts. If possible, the monitor brightness and contrast are adjusted using a standard (e.g., a photometer).

No, it's one part. Calibration to produce a desired behavior. That's it. There is NO standard! I don't know if you are purposely trying not to understand this, or if you are really struggling with it. There is no standard per se for the instruments. They absolutely are not created equal, nor do they produce the same measurements! I can assure you a $5K spectroradiometer and a $500 colorimeter do not adhere to any standards. They should, but that's a different topic. Even if they did, you fail to accept that whichever you use, there is NO STANDARD for targeting the calibration; it varies depending on the needs. I've explained why repeatedly.

There is absolutely no standard for the number of patches read, or the colors. That's just an absolutely false idea. Each package has differing patch targets, and some allow custom targets to be created. I don't know where you get the ideas you're posting; I can and will show you that differing software packages can and will allow the creation of all kinds of color targets to calibrate a display. AND the patches themselves don't calibrate; they are simply used to measure the emissive conditions at the calibration aim points you pick, which again, hugely differ depending on the software used.
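To make the point about differing aim points concrete, here are a few plausible calibration targets expressed as plain data in Python; the specific numbers are illustrative examples only, not values taken from any particular software package:

```python
# Example calibration aim points. Different software and different workflows
# choose different targets; none of these values is a universal standard.
aim_points = {
    "print_proofing":   {"white_point": "D50", "tone_response": "gamma 2.2", "luminance_cd_m2": 100},
    "general_photo":    {"white_point": "D65", "tone_response": "gamma 2.2", "luminance_cd_m2": 120},
    "video_rec709":     {"white_point": "D65", "tone_response": "BT.1886",   "luminance_cd_m2": 100},
    "dim_room_editing": {"white_point": "D65", "tone_response": "gamma 2.4", "luminance_cd_m2": 80},
}

for name, target in aim_points.items():
    print(f"{name}: {target}")
```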

If possible, the monitor brightness and contrast are adjusted using a standard (e.g., a photometer).

The correct terms used for these instruments are spectroradiometer or colorimeter. Correct, not parsing.

The same device is then used to read color patches (another standard) to produce a profile which conforms the output of the screen to the expected color of the patches.

Again no, no standard. The profile is simply a reflection of device behavior. This is color management 101. Please attempt to study how this stuff actually works before posting again.

No, the profile isn't a physical adjustment. It simply defines device behavior. You are rather confused about the vast differences between calibration and profiling. A display profile can load a LUT into the video system (less than ideal), but its role in color-managed software is to define device behavior. And with no such standards! With no info about color accuracy at this point in the process.
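To illustrate the distinction being drawn, here is a small sketch of what "defining device behavior" can mean for a simple matrix-style display profile: given measured XYZ values for the red, green, and blue primaries (the numbers below are invented) and a tone curve, you can predict the XYZ the display will produce for an RGB value, without adjusting the display at all:

```python
# Sketch of characterization (profiling), not correction: predict the XYZ a
# display produces for an RGB value, from measured primaries and a tone curve.
# The primary measurements below are invented for illustration.
import numpy as np

# Columns are the measured XYZ of full red, green, and blue (hypothetical values).
primaries_xyz = np.array([
    [0.436, 0.385, 0.143],   # X contributions of R, G, B
    [0.222, 0.717, 0.061],   # Y contributions
    [0.014, 0.097, 0.714],   # Z contributions
])

def device_rgb_to_xyz(rgb, gamma=2.2):
    """Predict display output: linearize with the measured tone curve,
    then mix the linear channels through the primary matrix."""
    linear = np.asarray(rgb, dtype=float) ** gamma
    return primaries_xyz @ linear

print(device_rgb_to_xyz([1.0, 1.0, 1.0]))  # predicted white point XYZ
print(device_rgb_to_xyz([0.5, 0.5, 0.5]))  # predicted mid-grey XYZ
```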

Not surprisingly, I have met the same resistance to good measurement practice from lab technicians and a few PhDs. In the pharmaceutical industry, NIST definitions and practices are not optional.

You're not getting resistance, you're being corrected, because much of what you've written is technically wrong. You can believe it's parsing of words; I'm not going to go out of my way to convince you otherwise, but our respective readers here can examine who's got the experience and facts straight and who doesn't. I've provided a number of points that easily dismiss, technically, what you've written. Enough said I think. :D

Author "Color Management for Photographers" & "Photoshop CC Color Management" (pluralsight.com)


I understand perfectly how it works; I have been analyzing data, constructing LUTs, and creating interpolation curves since you were still in diapers. Every niche industry wants to create its own jargon, ignoring existing technology. What I'm saying is consistent with measurement science, of which NIST is arguably the best repository.

 

A monitor profile is an LUT (i.e., a multi-dimensional correction factor), not itself a standard, but created using traceable calibration standards for measurement and color. While some of these standards derive from physical entities, others (color, color spaces, etc.) are equally serviceable consensus standards within the industry. A photometric device is ultimately traceable to basic physical properties of light and matter.

 

"LUT" stands for Look Up Table - a matrix of constants. They can be used directly, but more effectively expressed as polynomial interpolation curves. Monitor profiles derive from LUT's, as do video grading tools called "3D LUTs", which are used somewhat differently.


I understand perfectly how it works; I have been analyzing data, constructing LUTs, and creating interpolation curves since you were still in diapers.

Sure you were (having no idea my age either). IF you know how this works as you state, then write how it works correctly with respect to calibration, color accuracy and actual standards. You haven’t. Since you haven't, what you wrote that's factually iffy was corrected.

The concepts of color accuracy are well understood by some, and again, how it's correctly defined was outlined in the video.

Standards are well understood by some, and again, none are used for the calibration of a display. You've failed to cite any such standards for such calibration, because I don't believe they exist or you would have. Given that there are dozens upon dozens of possible calibration aim points, as illustrated in the software products that provide display calibration, it's simply silly to suggest there are some magic standards for display calibration (because there are not).

 

There are tools used to calibrate and profile a display, and their names and uses are well understood by some. The correct terms were already provided when the wrong ones were used earlier.

I am well aware of standards organizations such as NIST; I'm a member of several such organizations! I'm an acting member of one such group whose sole purpose is undertaking display standards! That would be the IEC TC110 WG10 working group. And I'm a member of the ICC. So I'm telling you, your comments about calibration standards for displays are fishy.

 

That you now want to tell me what LUT stands for, or that you constructed them while I was in diapers, only illustrates that your writings here are not to be taken too seriously, certainly in terms of display calibration, standards, and color accuracy. Which is fine; I had that idea a few posts ago and stated it without any parsing:

 

You're not getting resistance, you're being corrected, because much of what you've written is technically wrong. You can believe it's parsing of words; I'm not going to go out of my way to convince you otherwise, but our respective readers here can examine who's got the experience and facts straight and who doesn't. I've provided a number of points that easily dismiss, technically, what you've written. Enough said I think. :D

 

Enough said I think, again.

Author "Color Management for Photographers" & "Photoshop CC Color Management" (pluralsight.com)


Sorry to interrupt you both, I don't understand 1 milligram ...

 

May I just ask a little question? Will I see a difference on both of my screens using a D810 instead of my D610?

At 1:1, maybe. Lots of variables, with how the raw is rendered accounting for a lot of said variables.

Author "Color Management for Photographers" & "Photoshop CC Color Management" (pluralsight.com)


On the TV monitor I use JPEGs; it doesn't read raw.

No but the JPEG came from a raw. How the raw was rendered plays a huge role.

Even if you have a camera that doesn't provide you a raw, only a JPEG, that JPEG was processed somehow from raw data.

Author "Color Management for Photographers" & "Photoshop CC Color Management" (pluralsight.com)


No but the JPEG came from a raw. How the raw was rendered plays a huge role.

Even if you have a camera that doesn't provide you a raw, only a JPEG, that JPEG was processed somehow from raw data.

 

So, a 24 MB or 37 MB raw file converted to JPEG, I will not see any difference on both monitors? Correct?


So, a 24 MB or 37 MB raw file converted to JPEG, I will not see any difference on both monitors? Correct?

You are asking me to speculate, and I will not do so. You may see a huge difference or very little, again based on an enormous number of variables. Even from two different brands of 24 MP cameras.

Author "Color Management for Photographers" & "Photoshop CC Color Management" (pluralsight.com)

