
Dell U2711, Spyder3 Elite - disgusting grayscale (long)



<p>Hello.<br />Obviously there is a problem that I'm trying to solve and I know that it is not limited to my specific scenario. Instead of just trying to get a quick and easy fix I thought I'd address this issue in a more comprehensive manner so that other folks could benefit from this in the future. <br /><br />Explaining the whole situation turned out to be fairly long, so I divided this into 2 parts. You can just skip to the 'technical stuff' right away.<br /><br /><br />PREFACE:<br /><br />I've been very happy with Spyder2 and my old CRT display. Got me nice screen-to-print matching with their patch reader too. I've also done quite a lot of BW post-processing and have been happy with it. Obviously something good like that couldn't last. There came the time when I had no choice but to migrate to an LCD panel. I did a lot of poking around and posting, and was led to believe that the Dell U2711 was good enough for what I needed... We all make mistakes... Anyway, Spyder2 wasn't working anymore with my new panel so I got me a Spyder3. To be on the safe side I chose the Elite edition.<br /><br />Now, I realize that by the magnitude of its complexity, color accuracy and matching is a field worthy of a gentleman's lifetime dedication. I'm not a certified calibrator, nor do I wish to become one, but apparently even in the year 2012 one still has to possess an impressive set of skills just to ensure that some backward software doesn't ruin his display. That's right, in my case the picture is substantially better *without* calibration.<br /><br />Based on my prior experience I chose to go with the Spyder3 to accurately calibrate my panel, hassle-free, and keep it that way. Contrary to my expectations, the Spyder3 did a very sloppy job. Unfortunately, in the 2 years of using the product I hadn't realized that. 
Simply (and foolishly) I *trusted* Datacolor to yield the optimal calibration with minimal user intervention, just the way it was accomplished with my CRT monitor.<br /><br />When the time came for me to do some BW post-processing I started noticing various color casts all over the grayscale. That's when things started to go bad for me. I had no idea my display was *that much* out of whack. I've spent pretty much the whole week fumbling with the spectro, the software and my panel. Don't know how much the panel is to blame, but quite frankly I've grown to loathe them all with a passion. The OLEDs aren't here yet, so I'm pretty much stuck with this setup. Naturally, I want to make the best of this bad situation. Please don't recommend anything else. I swore to myself that this is my first and last LCD *ever*.<br /><br /><br>

THE TECHNICAL STUFF:<br /><br />Windows 7 Pro 64-bit<br />ATI Radeon HD 5670 1GB<br />Dell U2711 panel connected via DP, set to maximum resolution and bit-depth<br />Spyder3 Elite<br /><br /><br>

The problem:<br>

As the title of this post suggests, I have an issue with the way the grayscale appears on my panel. It is not limited strictly to some minor banding. Moreover, if the grayscale doesn't look right, it means that a whole lot of tones are out of whack too.<br />My grayscale pretty much begins at RGB 13,13,13 and naming it a 'grayscale' at this point would be too generous. It's more of a 'green-scale' because it has a distinctive greenish hue. Below 13 it just drops off abruptly. There are *a few* values below 13 whose densities I can spot using my eye-o-meter, but you'll agree with me that that's not the way it's supposed to look. At around RGB 30, the grayscale suddenly becomes more or less neutral and the brightness keeps increasing up to around 39. At 39 the density drops (becomes darker!) and the grayscale takes on a Red/Magenta tone. It becomes pretty much neutral at some point higher up, though some 'magenta' banding is still apparent, if not very severe. I guess the main perpetrator is the poorly engineered software, because the spectro does provide useful readings with some degree of consistency.<br>
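For anyone who wants to reproduce this kind of by-eye inspection, a full neutral ramp viewed at native resolution makes these casts obvious. A minimal Python sketch (purely illustrative, standard library only) that writes a 0-255 step wedge as a PPM image:

```python
# Write a horizontal grayscale ramp (RGB 0..255) as a binary PPM image.
# Displayed full-screen, it makes casts and banding like those described
# above (greenish near black, magenta higher up) easy to spot by eye.

WIDTH, HEIGHT = 256, 64

def write_gray_ramp(path):
    with open(path, "wb") as f:
        f.write(b"P6\n%d %d\n255\n" % (WIDTH, HEIGHT))
        # One row: each column x gets the neutral triple (x, x, x).
        row = bytes(v for x in range(WIDTH) for v in (x, x, x))
        f.write(row * HEIGHT)

write_gray_ramp("gray_ramp.ppm")
```

Every patch in the file has R = G = B, so any tint you see at a given level is coming from the calibration chain, not the source image.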

I must point out that this calibration is actually the *best* of what I was able to achieve. I'm looking to solve this issue using some other software but before I jump into that, let me share with you the 'fun' times I've had with Spyder3Elite.<br /><br />At this point there's a good chance I know more about the software itself than you do. In dealing with the problem I decided to be methodical. Instead of just playing around with it, hoping to get lucky and hit on the perfect combination of settings, I attempted to isolate the problematic areas. I then addressed each of those areas separately and did arrive at a point where I can say that I have the best calibration for this particular hardware/software combination. Obviously, this "best" isn't nearly as good as it ought to be.<br />In my tedious endeavor I was pretty much conducting a set of experiments. Everything is documented. I have all of those ICC profiles and I can recreate the results of each of those calibration attempts. I will describe just the interesting stuff.<br>

The slightly odd thing about my calibration is that I want my display to be fairly dim. That would be 71 cd/m2. I work in a completely dark room and that level of brightness is the maximum I can handle without getting headaches. The display settings are such (and I'll explain):<br>

Brightness: 8, Contrast: 50, Custom color; R:99, G:93, B:100<br>

The contrast shouldn't be changed from its factory default (50) because once you increase it, the colors just start going crazy on you, which would only make the calibration more difficult. With the settings mentioned above I can get contrast ratio of about 560:1. Increasing the contrast setting doesn't help there. The Brightness does though. At higher backlighting levels *this* panel *can* produce better contrast ratio. In my case, however, that is out of the question.<br>
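For reference, the contrast figure above pins down the black level, which is worth sanity-checking with the spectro. A trivial sketch, using only the numbers quoted in this post:

```python
# Contrast ratio is white luminance over black luminance, so the
# ~71 cd/m2 white point and ~560:1 ratio quoted above imply a black
# level of roughly 0.13 cd/m2.

WHITE = 71.0    # cd/m2, target brightness from this post
RATIO = 560.0   # measured contrast ratio from this post

black = WHITE / RATIO
print(round(black, 3))    # -> 0.127

def contrast_ratio(white, black):
    return white / black

print(round(contrast_ratio(WHITE, black)))    # -> 560
```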

As for the software: I can see no difference between ICC 4.0 and 2.0, so for now I just left it on 4.0. I can always recalibrate in the 2.0 format. I did get slightly better results with the Chromatic adaptation set to XYZ scaling vs. Bradford, so I'm leaving it there.<br>

In 'display type'/identify controls I ticked 'Contrast' and 'RGB Sliders'.<br />In the Expert Console I selected a white point of 6500K and Gamma 2.2.<br>

Luminance: Visual Mode (I'll tell you why). Gray balance: OFF, FullCAL.<br />Now.. What I found out is that letting the software mess with the white and black point is the worst thing you can do to your grayscale. So, basically, all I'm asking from my calibration is to adjust the gamma and take care of any color casts.<br>

In the initial stage of the calibration process, the spectro takes a few basic readings and then lets me adjust the settings on the display to bring them closest to what I'm shooting for. By adjusting the RGB sliders I can get within 50 Kelvin of 6500K. Every time I click 'update' I can also see the brightness output. So I adjust the Brightness on the panel to get the closest to my 71 cd/m2. Yes, this is how I set my "target brightness". I set it using only the hardware. Using the method described above I can always set my panel to that specific brightness level with a good degree of accuracy.<br>

As soon as I'm satisfied with the readout of the spectro I proceed to the actual calibration, which from that point is completely automatic. When it's all over, the grayscale looks like what I described earlier. By comparison the Uncalibrated mode looks substantially better than the Calibrated. The Grayscale is almost completely neutral throughout and there is hardly any appreciable banding. The colors look right, however the gamma appears just a tad off (brighter). You could say that I should just settle for it, and I'm honestly thinking about it. I could correct the gamma in my video card settings but before I take that route I want to make sure there's no better way to do it.<br /><br />Why did I choose to turn the 'Gray Balance Calibration' OFF? It would seem that running an 'Iterative' calibration would solve my problem.... What a joke... The results I get with this 'Iterative' calibration are just so much worse. The *joke* is really that the software doesn't even *attempt* to read the densities that I'm having such a problem with. It just reads patches of gray from maybe around RGB 80 and up.. when it should be correcting 1, 2, 3, 4...15.. and then maybe every 2 or 3 samples. And it makes sense that it doesn't do it. The signal-to-noise ratio of this spectro apparently isn't good enough to be poking around in those dark areas. It looks like the engineers at Datacolor knew that but simply went: "To hell with it.. we'll just make the spectro read the lighter intensities it has no problem with and the low end will turn out just fine.. If not, who's gonna care or even notice?..." Funny, no? I bet that is exactly how it went down! How else can you explain such negligence? <br /><br />Adjusting the Curves:<br>

I don't know which is more useless: the 'Gray Balance Calibration' or the 'Edit Curves' tool. Out of the entire spectrum they give you 9 points you can apply *very gross* adjustments to. If you haven't done so already, I recommend you try adjusting your profile using this tool... It's guaranteed to make you laugh.. (or vomit).<br /><br />WHAT I NEED.<br />I need a tool (software), either from Datacolor or, hopefully, from another manufacturer, to fine-tune my display's profile OR make one from scratch. Fortunately, with my spectro I can read any density on my screen, so I can kind of manually dial the numbers into a 3rd-party ICC profile creator (or whatever). Naturally, I would need a proven and fail-proof strategy there.<br>

Another idea: Since I can get myself very close to my target using just the hardware (settings on the display), maybe I can simply use my canned display profile and only adjust the gamma on my video card? I would need a proper method for it too, should I take that route.<br>
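The gamma-only idea can be sketched numerically. Assuming, purely for illustration, a measured native gamma of 2.0 (a made-up number, not a measurement from this thread), a 256-entry correction table for the video card LUT would look like this:

```python
# Build a 1D gamma-correction LUT: raising each normalized input to
# (target / measured) makes the net display response land on the target
# gamma. MEASURED_GAMMA below is a hypothetical value for illustration.

MEASURED_GAMMA = 2.0   # hypothetical measured display gamma
TARGET_GAMMA = 2.2     # the target set in the calibration software

def gamma_lut(size=256):
    exponent = TARGET_GAMMA / MEASURED_GAMMA
    return [round(((i / (size - 1)) ** exponent) * (size - 1)) for i in range(size)]

lut = gamma_lut()
print(lut[0], lut[255])   # -> 0 255 (endpoints stay pinned)
```

The catch is that an 8-bit video card LUT has to round these entries to whole codes, which is exactly where banding sneaks in.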

OR MAYBE, just maybe I can still somehow coax my spectro's native software into making me a proper profile. If you're reading this, it means you've probably been there. What is out there to help me out?<br>
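On the dark-end sampling gripe above (the software reading patches only from around RGB 80 and up), the denser strategy suggested there is easy to express. A hypothetical patch list in Python; this illustrates the idea only and reflects nothing Spyder3 actually does:

```python
# Build a measurement patch list that is dense near black (levels 0..15,
# where the green cast and density reversal were observed) and then
# samples every 3rd level up to white. Purely a hypothetical strategy.

def dark_weighted_samples(fine_up_to=15, step=3, maximum=255):
    fine = list(range(fine_up_to + 1))                           # 0, 1, ..., 15
    coarse = list(range(fine_up_to + step, maximum + 1, step))   # 18, 21, ..., 255
    return fine + coarse

patches = dark_weighted_samples()
print(len(patches))   # -> 96 patches, instead of a handful of light grays
```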

One last thing. I haven't really touched on my video adapter. What I have is nothing special really. It is not designated for CAD or optimized for any wide-gamut application. I did my best to ensure it's not affecting gamma, color temperature, gamut or anything like that, but if you know there's a problem with it I'll replace it in a heartbeat. When I was putting this machine together I actually thought that the parts I was getting were a pretty good combination. No one really knew whether the Spyder3 software could take advantage of an LUT that's more than 8 bits deep but I decided to be prepared in case it had that capability. The display adapter wasn't specific about its LUTs but I did read in one place on the web that it seemed to have a 10-bit LUT. I cannot confirm it, however. The panel is connected via DisplayPort, to ensure the high(er) bit capability. The panel itself has an internal 12-bit LUT, a property I am also taking advantage of. Everything seems to be *right there* to produce a very accurate, quality calibration... Everything except for the deficient software...<br>

PS: This is not about print-matching, as I don't print anymore.</p>


<blockquote>

<p>The panel itself has an internal 12-bit LUT, a property I am also taking advantage of. </p>

</blockquote>

<p>Not with the Spyder. All those bits are not real helpful if you can't calibrate the panel itself. I believe the Spyder is adjusting the LUTs in the graphics system, which is outside the high-bit panel. But that should be the least of your issues if you are getting really poor gray balance (the banding would point to the lack of high-bit usage). </p>

<p>With all the grief you are reporting, I'd throw the baby (this Dell) out with the bathwater (the Spyder), get a true reference display system designed from the ground up to calibrate within the panel, in high bit, with a really good instrument mated to that display, using software that fully controls the calibration. That would be a SpectraView PA series. </p>

Author “Color Management for Photographers" & "Photoshop CC Color Management" (pluralsight.com)


<p>Thank you but I'll just try to get *this one* calibrated as close as I can to what I need it to be. And if I can't I'll just suck it up until there's a proper and affordable OLED panel out there. No more LCDs for me. I *never* liked them.<br>

With my calibration method I can get the panel *very close* to my target without ever using the custom profile. The way I do it, the calibration is handled by the panel itself and I use the spectro strictly as feedback. I can get awfully close to 6500K using the RGB sliders and there is also a Gamma selector, which I naturally set to 2.2. All those calculations are done in 12 bits. To prove it: there's nothing I would call banding. The picture is almost perfect. However, that's not quite close enough to where I need to get. I still want to fine-tune the gamma using a custom-tailored profile as well as to make sure that every density throughout the gamut is right where it needs to be (or at least is very close). Please read the section titled: What I need. There are a few feasible options on the table but the one I'd prefer is to use an 'ICC plotter' to manually plot a custom display profile. I know there's stuff out there to do just that. Naturally I'm looking for something that costs nothing or at least is very affordable. Maybe my Spyder3 Elite license makes me eligible to purchase something like that from Datacolor at a discount. It would probably be the last thing I ever bought from them.</p>


<p>I didn't read your entire post, so correct me with what I'm about to suggest...</p>

<p>As Andrew indicated, the 12-bit internal hardware LUT is useless if you don't have software to take advantage of it. The fact you have an oddball OLED-type panel with no software/hardware colorimeter designed to accurately measure and profile those types of colorants also suggests YOU ARE BETTER OFF WITH AN LCD, especially if you're just using the 8-bit video LUT to generate a decent grayscale.</p>

<p>The fact you can't even get that also suggests you're wasting time and money on this OLED choice. I've got a regular sRGB gamut Dell 2209WA that gives excellent calibrated/profiled grayscale and color management to where my prints match my display. I paid $300 for it.</p>

<p>Mine works, yours doesn't. Do the math. </p>


<p>Because the Dell U2711 is a wide-gamut monitor, conventional colorimeters may not be suitable for building profiles for it. Your Spyder3Elite, as I understand it, is not a spectrophotometer; rather, it's a colorimeter with 7 filters (potentially better than one w/ 3 or 4 filters, but, still not a spectrophotometer).</p>

<p>Therefore, it's possible that the Spyder3Elite's filters are just not suitable for the U2711. You can build a correction matrix using the colorimeter & a spectrophotometer (e.g. ColorMunki) with the DispcalGUI software that Mike suggested. Or just use a ColorMunki in 'Adaptive Hi-Res' mode with the DispcalGUI software. I highly recommend that software package. The 'Adaptive' mode attempts to compensate for the deficiencies in low-cost spectrophotometers (namely that they are inaccurate when reading dark colors b/c of the low signal:noise ratio... 'adaptive' mode increases the integration time to get a better reading).</p>
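The correction-matrix approach mentioned above boils down to simple linear algebra: measure the same patches with both instruments and solve for the 3x3 matrix that maps the colorimeter's XYZ readings onto the spectrophotometer's. A standard-library sketch with invented numbers (the exact patch sets and math dispcalGUI uses may differ):

```python
# Sketch of building a colorimeter correction matrix: measure the same
# three patches (e.g. the display's R, G, B primaries) with both the
# colorimeter and the spectrophotometer, then solve M @ c = s patch by
# patch. All readings below are invented purely for illustration.

def solve3(A, b):
    """Solve the 3x3 linear system A x = b by Gauss-Jordan elimination."""
    m = [row[:] + [bi] for row, bi in zip(A, b)]
    for col in range(3):
        pivot = max(range(col, 3), key=lambda r: abs(m[r][col]))
        m[col], m[pivot] = m[pivot], m[col]
        for r in range(3):
            if r != col:
                f = m[r][col] / m[col][col]
                m[r] = [a - f * c for a, c in zip(m[r], m[col])]
    return [m[i][3] / m[i][i] for i in range(3)]

# XYZ readings of three patches (columns = patches), invented numbers:
colorimeter = [[41.0, 35.5, 17.8],   # X of patches 1..3
               [21.3, 71.2,  7.2],   # Y
               [ 1.9, 11.2, 93.1]]   # Z
spectro     = [[41.2, 35.8, 18.3],
               [21.3, 71.5,  7.2],
               [ 1.9, 11.4, 95.0]]

# Row i of M satisfies, for every patch j:
#   sum_k M[i][k] * colorimeter_patch_j[k] = spectro[i][j]
patch_triples = [list(p) for p in zip(*colorimeter)]
M = [solve3(patch_triples, row) for row in spectro]
```

With more than three patches the system is over-determined and you would fit it by least squares instead; per the post above, dispcalGUI can build such corrections for you from paired measurements.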

<p>As I understand it, you can take advantage of the 12-bit LUT on the U2711 by doing as much of the calibration as possible on the monitor itself. That includes RGB gain sliders & offset sliders (for setting the white point & black point, respectively). Getting these as close to the target as possible should mean that less corrections need to be done in the 8-bit LUT of the video card, which should mean less rounding errors (and hopefully less banding).</p>
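The rounding-error point is easy to demonstrate numerically. A small sketch; the 0.93 gain loosely echoes the G=93 slider setting mentioned earlier in the thread and is chosen purely as an illustration:

```python
# Applying a channel gain inside an 8-bit table collapses input codes
# (adjacent levels map to the same output, i.e. banding), while the same
# gain quantized to 12 bits keeps all 256 source levels distinct.

GAIN = 0.93   # illustrative gain, loosely echoing the G=93 slider

levels_8bit  = {round(i * GAIN) for i in range(256)}             # 8-bit pipeline
levels_12bit = {round(i * (4095 / 255) * GAIN) for i in range(256)}  # 12-bit pipeline

# The 8-bit table keeps only 238 of 256 levels; the 12-bit one keeps all 256.
print(len(levels_8bit), len(levels_12bit))
```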

<p>On the other hand, Andrew's suggestion of the NEC seems quite desirable since the 14-bit LUT is addressable by the hardware (which I assume is a colorimeter with filters matched for the wide-gamut display?). Also probably means less work & human error. In retrospect, I should've probably purchased one of those.</p>

<p>Instead, I purchased the U2711 & a ColorMunki... which comes out to the same price as the NEC monitor. Well hindsight is 20/20, right?</p>

<p>In my experience, I get good results profiling my U2711 w/ a ColorMunki using 'Adaptive Hi-Res' mode in dispcalGUI. I also get almost exactly the same profile when using the ColorMunki to build a correction matrix for my i1 Display 2 colorimeter, which is a nice sanity check. Of course, I don't have a good way to check how accurate my profile is... you can't rule out inter-unit variability in the ColorMunki. But I do get decent matches to prints on profiled printers, so I'm satisfied for the time being.</p>

<p>I also wonder if the NEC panel has better uniformity across the screen. The U2711 has a green-magenta cast problem across the screen (easily viewable on a blank white page) & its luminance can vary by over 20% across the screen (120 cd/m^2 in the center but 95 cd/m^2 near, but still well away from, the corners). I can't say I've been blown away by the U2711.</p>


<blockquote>

<p>On the other hand, Andrew's suggestion of the NEC seems quite desirable since the 14-bit LUT is addressable by the hardware (which I assume is a colorimeter with filters matched for the wide-gamut display?)</p>

</blockquote>

<p>It is indeed. And using what I’d consider a vastly superior colorimeter.</p>



<p>Mr. Lookingbill: My original post is fairly long, and I understand how someone would want to just skim through something like that. You got off topic there but the other gentlemen and I have already worked something out.<br /> Mr. Rodney: No one here claims otherwise and in my view, what you have is a good product that would suit my needs really well. However, like I stated, I need to work with what I have. In a few years, when buying an OLED panel, I'll see what NEC has to offer.<br /> Mr. Sanyal: Thank you! That was a very comprehensive response. I was at a point when I had to decide between another 'spider' and some 'monkey' I didn't know. I went for my Spyder namely because that was something I was already familiar with. I trusted Datacolor a second time and was wrong to do that.<br /> Mr. Blume suggested 2 different pieces of software. At a glance they really seem like what I'm looking for. One of them is bound to work for me, so thank you both so much!</p>

<p>No worries. Hopefully some of this information will help you.</p>

<p>To be clear: dispcalGUI needs Argyll libraries to work. The installation instructions are very clear & well detailed on the main site of dispcalGUI itself, here: http://dispcalgui.hoech.net/</p>

<p>The software works with most common devices (like the Munki) & offers comprehensive options that, in my opinion, make it much better than a lot of manufacturer-bundled software (the Munki software, last I checked, is a joke at best, for example).</p>

<p>-Rishi</p>


<p>Yuri,<br>

Rishi has explained what I omitted. The Argyll library is an extensive collection of software for calibrating and profiling a monitor. DispcalGUI is a graphical interface to that software, which makes it very easy to use.</p>

<p>I have used that combination to calibrate and profile my wide-gamut monitor (HP2475w), together with an X-Rite ColorMunki colorimeter, with excellent success.</p>


<p>I finally got around to testing this tool. I didn't have the time to run a quality calibration but it looks like I'm heading in the right direction. At this point I am technically proficient enough in this area, so I know I'm gonna have fun with this puppy. It's amazing that a tool like that has been out there for quite a while. It solves my problem better than I could ask for. What's even more amazing is that it's totally free. Google isn't always helpful when it comes to finding something this specific, so for once I'm glad I spent all those hours compiling my original post. Maybe someone in a similar situation will find this discussion helpful someday.<br /> My only regret is that I shelled out those extra bucks on the Elite edition of my Spyder, as that was 100% money down the drain.<br /> Thank you all again!</p>

<p>I am having so much fun with this little gem. The results I'm getting are better than I thought possible. Now I want to *really* get into it and make the tool read all those problematic samples *one by one*.<br>

What I wanted to share with the folks who are interested is this little line from the log I obtained through "Report on uncalibrated display device" which can be accessed through the "Tools" menu. The line is this:<br>

"Effective LUT entry depth seems to be 10 bits"<br>

What this means is that my display adapter has a 10 bit LUT and that pretty much verifies the speculation I had. This is good news.</p>

