New Dell monitors with true hardware calibration, 99% aRGB coverage

Discussion in 'Digital Darkroom' started by zoltan_arva_toth, Feb 15, 2013.

  1. Have you seen these monitors yet? The 24" and 27" models could be interesting for photographers:
    http://www.photographyblog.com/news/dell_ultrasharp_u2413_u2713h_and_u3014/
     
  2. A question: does Dell make its own monitors, or does it put its label on someone else's?
     
  3. Doubt it.
    Probably a custom design to Dell specifications by a contract OEM.
     
  4. All the connections are nice. Mac users should really like this. From the reviews, it sounds as though there may still be some hardware and software bugs to work out, but there should be potential for future updated versions. B&H had a Google ad showing $844 when I did a search for the 27". NEC and Eizo might need to up their game and lower their prices!
     
  5. Reading the review, it's incredibly lame that the ONLY supported hardware calibration device is the i1DisplayPro. Way to go, X-Rite!
     
  6. First, there are only two or three LCD manufacturers out there (LG is one for sure). So no, Dell doesn't make the panels.
    In our understanding, the UltraSharp U2413 is one of the most affordable 24” monitors to support hardware calibration with an X-rite i1DisplayPro colorimeter and accompanying software, which allows users to program the device’s 14-bit LUT.​
    Their understanding is wrong! All SpectraView displays can do this. Eizo does too. Now if you want to discuss affordability, that's one thing. But hardware calibration in the panel with the X-rite device(s) is not new.
    Reading the review, it's incredibly lame that the ONLY supported hardware calibration device is the i1DisplayPro. Way to go, X-Rite!​
    Got nothing to do with X-rite. I drive my SpectraView II with that unit with hardware calibration within the panel.
    About percentages (of this or that color space): there is a lot of confusion about this in the industry. The de facto standard when discussing display gamut sizes is currently to use the gamut area, calculated in CIE xy, relative to a reference gamut and expressed as a percentage. If the reference color gamut is unspecified, it is assumed to be NTSC (1953) - not really useful, since that space is no longer in use, which makes things more confusing, especially for those doing video work.

    Another confusing point about this figure is that it does not say what portion of the two gamuts overlaps, so it is possible to have a very large gamut-area percentage while only a smaller portion of it actually covers the reference gamut.

    NEC uses two sets of figures: "Percent Area" and "Percent Coverage".
    The "Percent Area" is simply the area in CIE xy of the display gamut vs. the reference gamut, with no consideration of how much the gamuts actually overlap. This value can be > 100%. The "Percent Coverage" is the overlapping area of the two gamuts expressed as a percent of the total area of the reference gamut. The maximum possible value for this is 100%.
    Bottom line: when looking at a wide gamut display, the percentage figures are among the least important specifications compared to the other items to look at.
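The two figures described above can be illustrated with a short Python sketch (my own, not from the thread). Gamuts are modeled as triangles in CIE xy from the standard published sRGB and NTSC 1953 primaries: "percent area" is a plain ratio of triangle areas, while "percent coverage" first intersects the two triangles (Sutherland-Hodgman clipping) and only counts the overlap.

```python
def area(poly):
    """Absolute shoelace area of a polygon given as [(x, y), ...]."""
    s = 0.0
    for (x1, y1), (x2, y2) in zip(poly, poly[1:] + poly[:1]):
        s += x1 * y2 - x2 * y1
    return abs(s) / 2.0

def clip(subject, clipper):
    """Sutherland-Hodgman clipping: intersect 'subject' with a convex,
    counter-clockwise 'clipper' polygon."""
    def inside(p, a, b):
        # True if p lies on or left of the directed edge a -> b
        return (b[0]-a[0]) * (p[1]-a[1]) - (b[1]-a[1]) * (p[0]-a[0]) >= 0
    def cross_pt(p, q, a, b):
        # intersection of line pq with the clip edge ab
        den = (p[0]-q[0]) * (a[1]-b[1]) - (p[1]-q[1]) * (a[0]-b[0])
        t = ((p[0]-a[0]) * (a[1]-b[1]) - (p[1]-a[1]) * (a[0]-b[0])) / den
        return (p[0] + t * (q[0]-p[0]), p[1] + t * (q[1]-p[1]))
    out = list(subject)
    for a, b in zip(clipper, clipper[1:] + clipper[:1]):
        inp, out = out, []
        for p, q in zip(inp, inp[1:] + inp[:1]):
            if inside(q, a, b):
                if not inside(p, a, b):
                    out.append(cross_pt(p, q, a, b))
                out.append(q)
            elif inside(p, a, b):
                out.append(cross_pt(p, q, a, b))
        if not out:
            break
    return out

# xy chromaticities of the R, G, B primaries (counter-clockwise)
srgb = [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)]
ntsc_1953 = [(0.670, 0.330), (0.210, 0.710), (0.140, 0.080)]

pct_area = 100 * area(srgb) / area(ntsc_1953)
pct_coverage = 100 * area(clip(srgb, ntsc_1953)) / area(ntsc_1953)
print(f"sRGB vs NTSC 1953: {pct_area:.1f}% area, {pct_coverage:.1f}% coverage")
```

Since sRGB's blue primary sits slightly outside the NTSC triangle, coverage comes out a little lower than the area ratio, which is exactly the gap the two figures are meant to expose.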
     
  7. I disagree here. So Dell contracts w/ someone to make the finished panels and they contract w/ XRite to create a customized piece of software to drive the internal display hardware, a la Spectraview. So who decided that Colormunki, i1Pro and Spyder canNOT be used? Did Dell not know any better and let XRite say 'well, we'll only write the software to support our latest and greatest'?
     
  8. there are only two or three LCD manufacturers out there​
    For photography uses, this is correct. The manufacturers are Sharp, LG and Samsung. There are a number of companies that manufacture smaller LCD panels, but these don't go into monitors that would be used for photo editing. Some of them could eventually make larger panels, but profits are slim so it may not look that exciting to them.
    Sharp is in bad shape financially and losing a lot of money in the LCD panel business, so it may go down to two manufacturers unless some company other than Samsung or LG wants to buy Sharp's panel business.
     
  9. So Dell contracts w/ someone to make the finished panels and they contract w/ XRite to create a customized piece of software to drive the internal display hardware, a la Spectraview. So who decided that Colormunki, i1Pro and Spyder canNOT be used?​
    Before you get out the pitchforks and torches for X-rite, it'd be useful to understand your options. There are two ways to work with the i1DPro colorimeter. One is, you buy it with the X-rite software and use it anywhere and everywhere you want.
    For less money, at least with NEC, you can get a bundle which is the same instrument but can only be used on that system due to a setting in the instrument. The SpectraView software supports other devices (Spyder, ColorMunki etc). If you own one, just buy the software alone.
    You want to use the i1DPro hardware on multiple devices outside the SpectraView? No problem: don't get the NEC bundle and, like me, you can use the i1DPro on the SpectraView II and your MacBook. It costs more. You use the X-rite software on the MacBook, and the SpectraView II software on that display, with the same device.
    I have no idea what Dell is doing here that is similar or different.
     
  10. The LCD is just one part in a monitor assembly. There may be only three companies making LCDs for computer monitors, but that doesn't mean that there aren't many different models of panels. It's like saying all monitors with LG panels are the same. Might as well say that ALL digital cameras with Sony sensors are the same.
     
  11. My point is that I already have a Colormunki Photo and it works fine (not going down the road of debating its dark performance). If I wanted to purchase this Dell monitor, then I would HAVE to purchase a new i1DisplayPro, since that is the only puck the Dell software (written by XRite) supports for the display's internal LUTs for hardware calibration.
     
  12. My point is that I already have a Colormunki Photo and it works fine​
    And WOULD work fine in the SpectraView software. Dell, don't know. But that's not an issue with the hardware and X-rite.
    If I wanted to purchase this Dell monitor, then I would HAVE to purchase a new i1DisplayPro since that is the only puck the Dell software (written by XRite) supports for the display's internal LUTs for hardware calibration.​
    Don't buy the Dell, buy the SpectraView! We would also have to consider the software capabilities of both, warranty, etc. Ask Dell why, when you buy an i1D2 with the X-rite software, it can't work, unlike with NEC. Again, this is Dell's issue, not X-rite's.
     
  13. I have no idea what Dell is doing here that is similar or different.​
    If this is the first time Dell has offered a display with 14-bit internal hardware LUTs and the means to calibrate it in that manner, I'd say THAT is what's different, especially at those prices compared to NEC's and Eizo's.
    I think you're going to have to split those hairs a bit finer in determining the price-to-quality balance Dell is offering photographers, Andrew. I look forward to seeing what you find.
     
  14. The 14 bit capability is mostly a marketing spec that doesn't really mean much. Yes, more than 8-bits per color is ideal and yes, having this done in the panel is ideal.
    I'd be willing to spend more on an NEC than a Dell if the software to calibrate the display has the necessary capabilities (very robust control over white point, contrast ratio, the ability to load on-the-fly calibrations and profiles, trending, emulation of other color spaces, etc). I'd take a 12 bit display with those functions over a 14 bit one that doesn't have them, any day!
    Getting back to panels and the few companies who make them, keep in mind that not all the panels off the floor are of equal quality! We don't know whether Dell gets the seconds after the pick of the litter goes to NEC/Eizo. So there's more to this than just the price, the percentage of gamut or the number of bits.
     
  15. 14 bits is nice but the reality is that most people don't have computer configurations (e.g. both video cards AND drivers) that can fully support 14 bit output.
     
  16. 14 bits is nice but the reality is that most people don't have computer configurations (e.g. both video cards AND drivers) that can fully support 14 bit output.​
    Well, on the Mac that's currently impossible (thanks to Apple). Everything else in the chain (display, applications etc.) is ready to go. But having a higher bit depth in the panel is still useful with a wide gamut display, to avoid banding. 10 or 12 bits should be more than enough to do this.
    When you hear these people writing in their marketing material "with 14 bits you get billions of colors", the BS alert should kick in! The math is correct. The fact that we can't see anything close to even 24 bit (8-bit per color), or 16.7 million colors, should be noted.
     
  17. Doug, the 14 bits refers to the precision within the Dell's internal LUT in delivering a linear tone distribution so there's no banding in a black-to-white gradient. A null/flat curve is applied to the 8 bit video card LUT, so the linearization isn't performed by applying curves in the video LUT. The 14 bit precision linearization curves are built and applied within the display's hardware LUT.
    But it still doesn't give a clue about the quality of the color matrices built into a LUT-less ICC display profile used by color-managed apps, or how it will line up with the 14 bit hardware LUT, which I'm guessing is what Andrew's referring to about the precision of color gamut claims.
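A toy Python sketch (my illustrative numbers, not from the thread) of why the LUT's precision matters for banding. It assumes a mild calibration adjustment, here a 10% reduction of one channel's maximum such as a white-point trim might require: computed at 8-bit precision this collapses distinct input codes into the same output code, while a 14-bit LUT keeps all 256 input steps distinct.

```python
def distinct_codes(in_levels, out_levels, gain=0.9):
    """Count distinct output codes after scaling by 'gain' and
    quantizing to out_levels steps."""
    return len({round(i / (in_levels - 1) * gain * (out_levels - 1))
                for i in range(in_levels)})

lut_8bit = distinct_codes(256, 256)     # gain curve held in an 8-bit LUT
lut_14bit = distinct_codes(256, 16384)  # same curve held in a 14-bit LUT
print(lut_8bit, lut_14bit)  # the 8-bit LUT loses a couple dozen codes; 14-bit keeps all 256
```

This is the same distinction the posts above draw between bending the 8-bit video-card LUT and doing the correction inside the display's higher-precision hardware LUT.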
     
  18. Hi Tim - Sorry, I was not correct or clear in my last post. I meant to speak in terms of 10 bit (not 14) versus 8 bit going into the monitor. After a demo at a local dealer who had a full 10 bit delivery path, I was able to see a slight difference in the gradient created. My intended point was that if a buyer is going to chase after the nth degree of perfection, everything in the hardware/data path has to support the higher data rate, and for many people that would mean an additional upgrade beyond just the monitor itself. I wasn't motivated to swap out my video card, the only weak link in my data path at this point, when I upgraded to an NEC PA series. It is rather aggravating that my video card's hardware and port could output 10 bit but the video company limits things via the driver. NEC support also told me another problem can be DisplayPort cables that are not manufactured well enough to reliably support the wider data path.
     
  19. Can somebody tell me what the final price for the 27" monitor plus recommended software etc. will be, in total?
     
  20. Doug,
    A question about the last sentence in your post regarding the DisplayPort cables:
    "NEC support also told me another problem can be DisplayPort cables that are not manufactured well enough to reliably support the wider data path."
    Can you elaborate? Were they referring to the DisplayPort cables that come packaged with the NEC PA series monitors? (If so, that's very disappointing.) Did they say what brand of cables would reliably support 10 bits?
    Anybody else aware of this issue?
     
  21. No, they were not referring to their cables. They were referring to third-party cables. Some that show the logo for certification still exhibit problems. If you buy through NEC, I am quite confident you will be OK. You can find references to this potential problem on the NEC site if you look around.
     
  22. 14 bits is nice but the reality is that most people don't have computer configurations (e.g. both video cards AND drivers) that can fully support 14 bit output.
    Well on Mac, that's currently impossible (thanks to Apple). Everything else in the chain (display, applications etc) are ready to go.​
    What about Windows? Can it support 14 bit output?
     
  23. According to this review of the U2413:

    http://www.tftcentral.co.uk/reviews/dell_u2413.htm
    One thing which separates this screen from many mainstream monitors, including the previous Dell 27" offerings, is the support for hardware calibration. Users can program the monitors 14-bit Look Up Table (LUT) if they have the appropriate software and hardware to achieve higher levels of accuracy, ...​
    But
    Hardware calibration very inflexible with regards to compatible devices and lacking reporting function​
     
