
Installing a new video card (capable of 10-bit gamut) in a Windows 7 PC



<p>I recently purchased a new NEC PA272W on sale and, to make use of its 10 bits per color channel, I have to change to a new video card. My power supply is more than enough: 850 watts. I was advised to change cards rather than adding one, and I don't believe I have a spare slot anyway. The instructions call for removing the driver for the original card, shutting down the computer, changing cards, booting, and then installing the driver for the new card.</p>

<p>What I don't understand is this: how will the monitor work before its driver is installed?</p>


<p>Yes. Windows will use a built-in default driver; otherwise you wouldn't be able to see any video while installing the new card's driver.<br>

FYI - many newer video cards need a power-supply connection independent of the bus. Generally, if your power supply doesn't have the necessary cable, you can buy an adapter that goes from a standard 12 V/5 V Molex connector to your video card.</p>


<p>Alan, I'm not sure I understand your question; if you're asking whether your card supports 10-bit output, the answer is no. The monitor, calibration and whatever other software you have don't matter: the card does not support 10-bit output.<br>

<br /> Practically, it means you'll have 8 bits per channel instead. Like all of us with all other screens, and for nearly all of us that's perfectly fine, as Brad points out. The additional precision that 10 bits per colour channel can give is hard to spot. Furthermore, the 10-bit option requires a supporting operating system (Windows 7 and later; I don't know whether the new OS X supports it, but at least the last one did not), supporting drivers (Quadro and FireGL only), application support (Photoshop does support it, many others do not), plus the screen. So roughly put: lots of requirements, limited benefit for most users. If I had a supporting system I'd use it, but I doubt I would spend any money on upgrades just to get this feature working.</p>
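<p>To put rough numbers on the difference, here is a minimal Python sketch (plain arithmetic, nothing assumed beyond the bit depths mentioned above) of how many gradations each bit depth allows per channel:</p>

```python
# Tonal levels per colour channel for the bit depths discussed above.
for bits in (6, 8, 10, 16):
    levels = 2 ** bits
    print(f"{bits:2d} bits/channel -> {levels:>6,} levels per channel, "
          f"{levels ** 3:>20,} total colours")
```

<p>So 10 bits gives 1,024 steps per channel instead of 256; the extra steps mainly matter on very smooth gradients, which is why the benefit is hard to spot in ordinary photographs.</p>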


<p>Wouter, now I'm confused. I've been told I should use Lightroom and not Photoshop Elements because Elements only provides 8 bits per channel whereas LR provides more (10, 12??). I assume that reflects in the print quality. So, is there a relationship between the limited 8 bits which is all my display card can give and the 8 bits per channel in prints, or are the two separate issues? And how?</p>

<p>Alan, you're confusing a number of things that aren't completely related.<br>

Lightroom, Photoshop and most other professional graphics packages can work on image files with 16 bits per channel of information. This higher bit count gives more accuracy and precision in internal calculations (= edits) and less risk of visible data loss, especially when applying heavy edits. So it ensures your source material (= image files) is handled with the highest possible accuracy and precision, and retains as much data as possible to reduce the risk of visible issues when using that data for output.</p>
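<p>As a hedged illustration of that point (a minimal sketch assuming NumPy is installed; the numbers are made up, not taken from any particular image), here's what a crude "darken a lot, then brighten back" edit does to an 8-bit versus a 16-bit file:</p>

```python
import numpy as np

# The same smooth ramp stored once at 8 bits and once at 16 bits per channel.
ramp = np.linspace(0.0, 1.0, 1024)
img8 = np.round(ramp * 255).astype(np.uint8)
img16 = np.round(ramp * 65535).astype(np.uint16)

def heavy_edit(img, max_val):
    """Darken drastically, then brighten back: a crude stand-in for heavy editing."""
    darkened = np.round(img.astype(np.float64) * 0.1)
    restored = np.clip(np.round(darkened * 10.0), 0, max_val)
    return restored.astype(img.dtype)

print("distinct levels left, 8-bit file: ", np.unique(heavy_edit(img8, 255)).size)
print("distinct levels left, 16-bit file:", np.unique(heavy_edit(img16, 65535)).size)
```

<p>The 8-bit file comes back with only a couple of dozen distinct levels (visible posterisation), while the 16-bit file keeps essentially all of its original gradations.</p>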

<p>Output being:<br>

Display output: controlled by the video driver and, above all, the monitor's capabilities. Most LCD displays do 6 bits per channel, the better ones 8 bits, and the best can do 10 bits. Normal video drivers do 8 bits per channel; only Windows drivers for professional cards can do 10 bits.<br>

Print output: most consumer printers use 8 bits per channel; better printers can deal with 16 bits per channel.<br>

In both cases, the bit count refers to the theoretical maximum number of gradations the device can reproduce for each colour channel.</p>
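<p>To make that concrete, a small sketch (again assuming NumPy; the output depths are just the theoretical maxima listed above) of how many of a 16-bit file's gradations actually survive on each kind of output:</p>

```python
import numpy as np

# A 16-bit working-space gradient, quantised down to typical output depths.
working = np.linspace(0, 65535, 4096).astype(np.uint16)

for out_bits in (6, 8, 10):
    out_max = 2 ** out_bits - 1
    on_device = np.round(working / 65535 * out_max).astype(np.uint16)
    print(f"{out_bits:2d}-bit output: {np.unique(on_device).size:>5} distinct steps reach the screen/paper")
```

<p>Those limits apply regardless of how many bits the file was edited in; the point of the 16-bit working file is that nothing extra is thrown away before this final step.</p>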

<p>The key point is that you want to keep as much data as possible while editing, to avoid losing data, accuracy or precision. The final output (screen/print) imposes its own (independent!) limits anyway, and to avoid piling limitation upon limitation (which would mean serious visual degradation) it's best to keep as much data as possible to work on, all the way through. So the 8-bit limitation of your video card has little to do with the 8-bit limitation in PS Elements. But combine the two, and you'll find that extreme edits will cause visible issues. In moderate editing the risks are much lower, but it's still better practice to avoid the issue altogether.</p>

<p><br />Sorry, I cannot explain it any more simply and straightforwardly than this; if you need more info, please do try a search engine, as I am sure there are places where it is explained better. To the OP, sorry for straying well off-topic.</p>


<p>OK, so I'll continue using LR for best results and not worry about the 8-bit-per-channel display. The display seen on other monitors will be whatever their monitor and processor can handle, and the print process is unaffected. So I'm good to go with the graphics card I have now. Thanks Wouter, and sorry to Hector for straying as well.</p>

<blockquote>

<p>OK, so I'll continue using LR for best results and not worry about the 8-bit-per-channel display</p>

</blockquote>

<p>Actually your NEC PA is a high-bit panel, and that takes care of many of the '<em>issues</em>' of a lower-bit-depth video path. No, you will not see a perfectly smooth gradient, depending on how it's built, compared to what you'd see IF you had a full high-bit video path. But you have a superb display that's doing a lot of work to avoid banding on-screen. <br>

While a full high-bit path is nice, it's a bit '<em>oversold</em>' IMHO. If I were building a system from scratch (that would be on a Mac), and I was certain the new OS supported it (still not sure), I'd go for a video card that I knew also supported 10 bits, for a full high-bit video path. But short of that, having a PA or similar display is plenty good; I don't see the reason to run out and get an expensive video card just for that functionality. </p>

Author “Color Management for Photographers" & "Photoshop CC Color Management" (pluralsight.com)


<blockquote>

<p>El Capitan does support 10-bit</p>

</blockquote>

<p><em>Maybe</em>...</p>

<blockquote>

<p><br /> https://luminous-landscape.com/finally-here-10-bit-lcd-graphic-monitors/</p>

</blockquote>

<p>We've had high-bit displays for years; nothing new in that old article. What's been missing, and what Apple hasn't specifically told anyone yet, is whether El Capitan supports it for all displays that have a high-bit path. Some sites have <em>suggested</em> it does; I've seen no proof that it does. This was specifically asked on the Apple ColorSync users list, and the question remains unanswered by Apple.</p>

Author “Color Management for Photographers" & "Photoshop CC Color Management" (pluralsight.com)


<blockquote>

<p><em>Maybe</em>...<br>

</p>

</blockquote>


<p>"Among the many new features in OS X El Capitan, it seems Apple has silently integrated another one: 10 bit color for the 4K & 5K iMac. Very interesting news for colorists, photographers, and editors.</p>

<p>A cinema5D reader reported that he got 10 bit on a Mac Pro with D500 graphics and an Eizo CS230 monitor. Also, currently it only works within the Preview and Photos applications. If you want to test it out, you could take a 12-bit RAW photo with soft color gradations and take a look. But it’s also important to note that, for now, no other apps, such as Adobe or other editing software, take advantage of this processing, yet. This is just a preview of what’s to come. For those who have been waiting for this feature for a long time, it’s important news."</p>

<p>https://www.cinema5d.com/5k-imac-10-bit-color/</p>
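<p>If anyone wants to try the gradient test the article describes without hunting for a suitable RAW file, here's a rough sketch (assuming NumPy and a reasonably recent Pillow are installed; the file name is arbitrary) that writes a smooth 16-bit ramp you can open in Preview or Photos and inspect for banding:</p>

```python
import numpy as np
from PIL import Image

# A horizontal 16-bit ramp confined to the darker quarter of the tonal range,
# so the steps of an 8-bit video path are wide enough to spot as faint bands.
width, height = 2048, 512
ramp = np.linspace(0, 65535 // 4, width)
gradient = np.tile(ramp, (height, 1)).round().astype(np.uint16)

Image.fromarray(gradient).save("gradient16.tif")  # saved as 16-bit grayscale
print("Wrote gradient16.tif - look for visible steps across the ramp.")
```

<p>On a full 10-bit path the ramp should look continuous; on an 8-bit path you may be able to make out faint vertical bands.</p>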



<blockquote>

<p>Among the many new features in OS X El Capitan, it seems Apple has silently integrated another one: 10 bit color for the 4K & 5K iMac. Very interesting news for colorists, photographers, and editors.</p>

</blockquote>

<p>Eric, do you understand the sentence that explicitly says <em><strong>it seems</strong></em>?</p>

<blockquote>

<p><em>A cinema5D reader reported that he got 10 bit on a Mac Pro with D500 graphics and an Eizo CS230 monitor. Also, currently it only works within the Preview and Photos applications<br /></em></p>

</blockquote>

<p>Yet according to Chris Cox of Adobe, Photoshop has had support for this for a while. <em>It seems</em> <strong>maybe</strong> this OS support isn't fully or even partially implemented. SEE if you can find anything from Apple that officially states there's now a true, high-bit OS display path.</p>

Author “Color Management for Photographers" & "Photoshop CC Color Management" (pluralsight.com)


<p>As usual you've missed the point, Eric. I said maybe, and that nothing official has been announced by Apple. Maybe means it is possible. You can believe a site that clearly wrote "it seems", but I prefer facts from sources that know, not guess. As usual, a request that you back up a claim goes ignored. I've got a number of sources looking into this, sources you can't call upon. When I get facts, I'll pass the actual data on to the group. I'd prefer the facts to be that high-bit display IS now real in Mac OS! But the so-called data you provided clearly illustrated a maybe. Trust but verify: try it sometime.</p>

Author “Color Management for Photographers" & "Photoshop CC Color Management" (pluralsight.com)

