Hector Javkin Posted November 14, 2015 <p>I recently purchased a new NEC PA272W on sale and, to make use of its 10-bit-per-channel colour, I have to change to a new video card. My power supply is more than adequate: 850 watts. I was advised to replace the card rather than add one, and I don't believe I have a spare slot anyway. The instructions call for removing the driver for the original card, shutting down the computer, swapping the cards, booting, and then installing the driver for the new card.</p> <p>What I don't understand is this: how will the monitor work before its driver is installed?</p>
Uhooru Posted November 14, 2015 <p>Make sure you unplug the computer before you open it up. Windows should have a vanilla video driver that will give you basic video; at least it did before I stopped using Windows a few years ago.</p>
richard_meyers Posted November 14, 2015 <p>Yes. Windows will use a built-in default driver; otherwise you wouldn't be able to see any video to install a driver with.<br> FYI - Many newer video cards have a power supply attachment independent of the bus. Generally, if your PSU doesn't have the necessary cable, you can buy an adapter to go from a standard 12 V/5 V Molex connector to your video card.</p>
q.g._de_bakker Posted November 14, 2015 And the monitor will behave as a generic VGA monitor, with Windows examining its capabilities and selecting a mode (synchronisation and resolution) that works.
Wouter Willemse Posted November 14, 2015 <p>Please do note that so far, only the professional-range cards with AMD or Nvidia chips support 10-bit output. So you'll need an Nvidia Quadro or AMD FireGL-based card; they do tend to cost quite a bit more.</p>
Hector Javkin Posted November 14, 2015 <p>Thank you Barry, Richard and Q.G. I now understand the process. It makes sense that there should be some default driver.</p> <p>Thank you Wouter. I'm planning to get one of the Nvidia Quadro cards, either the K620 or the K2200.</p>
AlanKlein Posted November 15, 2015 <p>I have an AMD Radeon R9 270. Does it handle a 10-bit colour gamut?</p> Flickr gallery: https://www.flickr.com/photos/alanklein2000/albums
Wouter Willemse Posted November 15, 2015 Alan, no, the Radeon cards do not support 10-bit colour output.
AlanKlein Posted November 15, 2015 <p>Wouter: So what does that mean? I have the NEC PA242W monitor with Spectroscope III puck and software for calibrating the monitor.</p>
brad_smith8 Posted November 15, 2015 <p>You know you can drive a 10-bit monitor with any card, right? Do you think you'll see a difference? I would only do this if I regularly used my monitor for scientific measurements with a colorimeter; IMO that's the only use that could show a difference.</p>
Wouter Willemse Posted November 15, 2015 <p>Alan, not sure if I understand your question; you asked me whether your card supports 10-bit output, and the answer is no. The monitor, calibration and whatever other software you have don't matter: the card does not support 10-bit output.<br> <br /> Practically, it means you'll have 8 bits per channel instead, like all of us with all other screens, and for nearly all of us that's perfectly fine, as Brad points out. The additional precision that 10 bits per colour channel can give is hard to spot. Furthermore, the 10-bit option requires a supporting operating system (Windows 7 and later; I don't know if the new OS X supports it, but at least the last one did not), supporting drivers (Quadro and FireGL only), application support (Photoshop does support it, many others do not), plus the screen. So roughly put: lots of requirements, limited benefit for most users. If I had a supporting system I'd use it, but I doubt I would spend any money on upgrades just to get this feature working.</p>
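[Editor's note: the practical difference between the two paths is simply the number of gradations each channel can represent. A minimal Python sketch of the arithmetic behind the bit depths mentioned in the post above:]

```python
# Levels per channel for common display/print bit depths.
# A 10-bit path has four times the gradations per channel of an 8-bit path.
for bits in (6, 8, 10, 16):
    levels = 2 ** bits
    print(f"{bits:>2}-bit: {levels:>6} levels per channel, "
          f"{levels ** 3:>20,} colours across three channels")
```

So the jump from 8 to 10 bits takes each channel from 256 to 1,024 gradations; whether that difference is visible on a given screen is the question debated in this thread.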
AlanKlein Posted November 15, 2015 <p>Wouter, so now I'm confused. I've been told I should use Lightroom and not Photoshop Elements because Elements only provides 8 bits per channel whereas LR provides more (10, 12??). I assume that reflects on the print quality. So, is there a relationship between the limited 8 bits, which is all my display card can give, and the 8 bits for prints, or are the two separate issues? And how?</p>
Wouter Willemse Posted November 16, 2015 <p>Alan, you're confusing a number of things that aren't completely related.<br> Lightroom, Photoshop and most other professional graphics packages can work on image files with 16 bits per channel of information. This higher bit count gives more accuracy and precision in internal calculations (=edits) and less risk of visible data loss, especially when applying heavy edits. So it ensures your source material (=image files) is handled with the highest possible accuracy and precision, and retains as much data as possible to reduce the risk of visual issues when using this data for output.</p> <p>Output being:<br> Display output; controlled by the video driver and, above all, the monitor's capabilities. Most LCD displays do 6 bits per channel, the better ones 8 bits, and the best can do 10 bits. Normal video drivers do 8 bits per channel; only Windows drivers for professional cards can do 10 bits.<br> Print output; most consumer printers use 8 bits per channel, while better printers can deal with 16 bits per channel.<br> In both cases, the bit count refers to the theoretical maximum number of gradations the device can reproduce for each colour channel.</p> <p>The key point is that you want to use as much data as possible while editing, to avoid losing data, accuracy or precision. The final output (screen/print) imposes its own (independent!) limits anyway, and to avoid piling limitation upon limitation (which would mean serious visual degradation) it's best to keep as much data as possible to work on, all the way through. So the 8-bit limitation of your video card has little to do with the 8-bit limitation in PS Elements. But combine the two, and you'll find that extreme edits cause visual issues. With moderate editing the risks are much lower, but it's still better practice to avoid the issue altogether.</p> <p><br />Sorry, I cannot explain it more simply and straightforwardly than this; if you need more info, please do try a search engine, as I am sure there are places where it is explained better. To the OP, sorry for straying well off-topic.</p>
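[Editor's note: the "edit deep, output shallow" point above can be illustrated with a small Python sketch. The "heavy edit" here is a hypothetical contrast stretch chosen purely for illustration; the numbers are not from any real editor. Quantising to 8 bits *before* a strong edit leaves far fewer distinct output levels (visible as banding) than editing at full precision and quantising only for output:]

```python
# A smooth gradient at high precision (4096 steps, like a 12-bit source).
gradient = [i / 4095 for i in range(4096)]

def stretch(v, lo=0.4, hi=0.6):
    """Hypothetical heavy edit: stretch a narrow tonal range to full range."""
    return min(1.0, max(0.0, (v - lo) / (hi - lo)))

# Pipeline A: quantise to 8 bits first, then edit (8-bit editor).
eight_bit = [round(v * 255) / 255 for v in gradient]
edited_8 = {round(stretch(v) * 255) for v in eight_bit}

# Pipeline B: edit at full precision, quantise only for 8-bit output.
edited_full = {round(stretch(v) * 255) for v in gradient}

# Pipeline A retains far fewer distinct output levels than pipeline B,
# which is what shows up on screen as banding/posterisation.
print(len(edited_8), len(edited_full))
```

The output limit (8 bits here) is the same in both pipelines; only the point at which the data is thrown away differs, which is exactly Wouter's argument for editing in 16 bits regardless of the display path.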
AlanKlein Posted November 16, 2015 <p>OK, so I continue using LR for best results and don't worry about the 8-bit-per-channel display. The display seen on other monitors will be whatever their monitor can handle, and the print process is unaffected. So I'm good to go with the graphics card I have now. Thanks Wouter, and sorry to Hector for straying as well.</p>
Hector Javkin Posted November 17, 2015 <p>No need to apologize. My rather simple question had already been answered, and the continuing discussion made it a more interesting thread.</p>
digitaldog Posted November 20, 2015 <blockquote> <p>Ok so I continue using LR for best results and not worry about the 8 bit per channel display</p> </blockquote> <p>Actually your NEC PA is a high bit panel, and that takes care of many of the '<em>issues</em>' of a lower bit depth video path. No, you will not see a perfectly smooth gradient, depending on how it's built, compared to what you'd see IF you had a full high bit video path. But you have a superb display that's doing a lot of work in avoiding banding on-screen. <br> While a full high bit path is nice, it's a bit '<em>oversold</em>' IMHO. If I were building a system from scratch (that'd be on a Mac), and I was certain the new OS supports it (still not sure), I'd go for a video card that I knew supported 10 bits, for a full high bit video path. But short of that, having a PA or similar display is plenty good; I don't see a reason to run out and get an expensive video card just for that functionality.</p> Author “Color Management for Photographers" & "Photoshop CC Color Management" (pluralsight.com)
EricM Posted November 25, 2015 <p>http://www.tedlansingphotography.com/blog/?p=287</p> <p>https://luminous-landscape.com/finally-here-10-bit-lcd-graphic-monitors/</p> <blockquote> <p>If I were building a system from scratch (that be on a Mac), and I was certain the new OS supports that (still not sure)</p> </blockquote> <p> <br> El Capitan does support 10-bit</p>
digitaldog Posted November 25, 2015 <blockquote> <p>El Capitan does support 10-bit</p> </blockquote> <p><em>Maybe</em>...</p> <blockquote> <p><br /> https://luminous-landscape.com/finally-here-10-bit-lcd-graphic-monitors/</p> </blockquote> <p>We've had high bit displays for years; nothing new in that old article. What's been missing, and what Apple hasn't specifically told anyone as yet, is whether El Capitan supports it for all displays that have a high bit path. Some sites have <em>suggested</em> it does; I've seen no proof that it does. This was specifically asked on the Apple ColorSync user list, and the question remains unanswered by Apple.</p>
EricM Posted November 25, 2015 <blockquote> <p><em>Maybe</em>...<br> </p> </blockquote> <p> </p> <p>"Among the many new features in OS X El Capitan, it seems Apple has silently integrated another one: 10 bit color for the 4K & 5K iMac. Very interesting news for colorists, photographers, and editors.</p> <p>A cinema5D reader reported that he got 10 bit on a Mac Pro with D500 graphics and an Eizo CS230 monitor. Also, currently it only works within the Preview and Photos applications. If you want to test it out, you could take a 12-bit RAW photo with soft color gradations and take a look. But it’s also important to note that, for now, no other apps, such as Adobe or other editing software, take advantage of this processing, yet. This is just a preview of what’s to come. For those who have been waiting for this feature for a long time, it’s important news."</p> <p>https://www.cinema5d.com/5k-imac-10-bit-color/</p>
digitaldog Posted November 25, 2015 <ol> <li>Among the many new features in OS X El Capitan, it seems Apple has silently integrated another one: 10 bit color for the 4K & 5K iMac. Very interesting news for colorists, photographers, and editors.</li> </ol> <p>Eric, do you understand the sentence that explicitly says <em><strong>it seems</strong></em>?</p> <blockquote> <p><em>A cinema5D reader reported that he got 10 bit on a Mac Pro with D500 graphics and an Eizo CS230 monitor. Also, currently it only works within the Preview and Photos applications<br /></em></p> </blockquote> <p>Yet according to Chris Cox of Adobe, Photoshop has had support for this for a while. <em>It seems</em> <strong>maybe</strong> this OS support isn't fully or even partially implemented. SEE if you can find anything from Apple that officially states there's now a true, high bit OS display path.</p>
EricM Posted November 25, 2015 I'm comfortable with the information and believe it to be true. If you feel otherwise, then it's up to you to post contrary info from Apple. Don't knock yourself out though, as it means little to me, a Windows user.
digitaldog Posted November 25, 2015 As usual you've missed the point, Eric. I said maybe, and that nothing official has been announced by Apple. Maybe means it is possible. You can believe a site that clearly wrote "it seems", but I prefer facts from sources that know rather than guess. As usual, a request that you back up a claim goes ignored. I've got a number of sources looking into this, sources you can't call upon. When I get facts, I'll pass actual data on to the group. I'd prefer the facts to be that high bit display IS now real in Mac OS! But the so-called data you provided clearly illustrated a maybe. Trust but verify: try it sometime.
EricM Posted November 25, 2015 <p>Flippant, as usual. Please just link us up with facts instead of your wishy-washy bs.</p>
digitaldog Posted November 26, 2015 <blockquote> <p>Please just link us up with facts instead of your wishy washy bs.</p> </blockquote> <p>Exactly what I requested of you.</p>
q.g._de_bakker Posted November 26, 2015 Andrew, you are putting up a "maybe" against an "it seems", both expressing a degree of uncertainty. It serves no purpose.