Installing a new video card (capable of 10-bit gamut) in a Windows 7 PC

Discussion in 'Digital Darkroom' started by javkin, Nov 14, 2015.

  1. I recently purchased a new NEC PA272w on sale and, to make use of its 10 bits per color, I have to change to a new video card. My power supply is more than enough: 850 watts. I was advised to change cards rather than adding one, and I don't believe I have a spare slot anyway. The instructions call for removing the driver for the original card, shutting down the computer, changing cards, booting, and then installing the driver for the new card.
    What I don't understand is this: how will the monitor work before its driver is installed?
     
  2. Make sure you unplug the computer before you open it up. Windows should have a vanilla video driver that will give you basic video; at least it did before I stopped using Windows a few years ago.
     
  3. Yes. Windows will use a built-in default driver; otherwise, how would you be able to see any video at all in order to install the new driver?
    FYI - Many newer video cards have a power supply attachment independent of the bus. Generally, if your PS doesn't have the necessary cable you can buy an adapter to go from a standard 12v/5v molex connector to your video card.
     
  4. And the monitor will behave as a generic VGA monitor, with Windows examining its capabilities and selecting a mode (synchronisation and resolution) that works.
     
  5. Please do note that, so far, only the professional-range cards with AMD or Nvidia chips support 10-bit output. So you'll need an Nvidia Quadro or AMD FireGL based card; they do tend to cost quite a bit more.
     
  6. Thank you Barry, Richard and Q.G. I now understand the process. It makes sense that there should be some default driver.
    Thank you Wouter. I'm planning to get one of the Nvidia Quadro cards, either the K620 or the K2200.
     
  7. I have an AMD Radeon R9 270. Does it handle 10-bit color?
     
  8. Alan, no, the Radeon cards do not support 10-bit colour output.
     
  9. Wouter: So what does that mean? I have the NEC PA242W monitor with Spectroscope III puck and software for calibrating the monitor.
     
  10. You know you can drive a 10-bit monitor with any card, right? Do you think you'll see a difference? I would only do this if I regularly used my monitor for scientific measurements with a colorimeter; IMO, that's the only use that could show a difference.
     
  11. Alan, I'm not sure I understand your question; you asked whether your card supports 10-bit output, and the answer is no. The monitor, calibration and whatever other software you have don't matter. The card does not support 10-bit output.

    Practically, it means you'll have 8 bits per channel instead. Like all of us with all other screens, and, for nearly all of us, that's perfectly fine, as Brad points out. The additional precision that 10 bits per colour channel can give is hard to spot. Furthermore, the 10-bit option requires a supporting operating system (Windows 7 and later; I don't know if the new OS X supports it, but at least the last one did not), supporting drivers (Quadro and FireGL only), application support (Photoshop does support it, many others do not), plus the screen. So, roughly put: lots of requirements, limited benefit for most users. If I had a supporting system, I'd use it, but I doubt I would spend any money on upgrades just to get this feature working.
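    That list of requirements behaves like a chain where every link must hold. A tiny illustrative sketch (the dictionary keys and the helper function below are made up for illustration, not any real API):

```python
# Hypothetical checklist: a 10-bit display path only works when EVERY
# link in the chain supports it. Names and values are illustrative only.
requirements = {
    "operating_system": True,   # e.g. Windows 7 or later
    "video_card": True,         # e.g. Nvidia Quadro or AMD FireGL
    "video_driver": True,       # professional driver with 10-bit enabled
    "application": True,        # e.g. Photoshop
    "display_panel": True,      # a true 10-bit panel such as the NEC PA272w
}

def has_10bit_path(chain):
    """The path is only as deep as its weakest link."""
    return all(chain.values())

print(has_10bit_path(requirements))   # True: every link supports 10-bit

requirements["video_card"] = False    # swap in a consumer card...
print(has_10bit_path(requirements))   # False: the whole path falls back to 8-bit
```

    The point of the sketch: a single unsupporting component (say, a consumer Radeon card) drops the whole path back to 8 bits, no matter how good the other parts are.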
     
  12. Wouter. So now I'm confused. I've been told I should use Lightroom and not Photoshop Elements because Elements only provides 8 bits per channel whereas LR provides more (10, 12??). I assume that reflects on the print quality. So, is there a relationship between the 8-bit limit of my display card and the 8 bits for prints, or are the two separate issues? And how?
     
  13. Alan, you're confusing a number of things that aren't completely related.
    Lightroom, Photoshop and most other professional graphics packages can work on image files with 16 bits per channel of information. The higher bit count gives more accuracy and precision in internal calculations (= edits) and less risk of visible data loss, especially when applying heavy edits. So it ensures your source material (= image files) is handled with the highest possible accuracy and precision, and retains as much data as possible to reduce the risk of visual issues when using this data for output.
    Output being:
    Display output: controlled by the video driver and, above all, the monitor's capabilities. Most LCD displays do 6 bits per channel, the better ones 8 bits, and the best can do 10 bits. Normal video drivers do 8 bits per channel; only Windows drivers for professional cards can do 10 bits.
    Print output: most consumer printers use 8 bits per channel; better printers can deal with 16 bits per channel.
    In both cases, the bit count refers to the theoretical maximum number of gradations the device can display for each colour channel.
    The key point is that you want to work with as much data as possible while editing, to avoid losing data, accuracy or precision. The final output (screen/print) imposes its own (independent!) limits anyway, and to avoid stacking limitation upon limitation (which would mean serious visual degradation) it's best to keep as much data as possible all the way through to work on. So the 8-bit limitation of your video card has little to do with the 8-bit limitation in PS Elements. But combine the two and you'll find that extreme edits cause visible issues. With moderate editing the risks are much lower, but it's still better practice to avoid the issue altogether.

    Sorry, I cannot explain it more simply and straightforwardly than this; if you need more information, please try a search engine, as I am sure there are places where it is explained better. To the OP, sorry for straying well off-topic.
     
  14. OK, so I continue using LR for best results and don't worry about the 8-bit-per-channel display. The display seen on other monitors will be whatever their monitor's processor can handle, and the print process is unaffected. So I'm good to go with the graphics card I have now. Thanks Wouter, and sorry to Hector for straying as well.
     
  15. No need to apologize. My rather simple question had already been answered, and the continuing discussion made it a more interesting thread.
     
  16. digitaldog (Andrew Rodney)

    Ok so I continue using LR for best results and not worry about the 8 bit per channel display​
    Actually, your NEC PA is a high bit panel, and that takes care of many of the 'issues' of a lower bit depth video path. No, you will not see a perfectly smooth gradient, depending on how it's built, compared to what you'd see IF you had a full high bit video path. But you have a superb display that's doing a lot of work in avoiding banding on-screen.
    While a full high bit path is nice, it's a bit 'oversold' IMHO. If I were building a system from scratch (that'd be on a Mac), and I was certain the new OS supports it (still not sure), I'd go for a video card that I knew also supported 10 bits for a full high bit video path. But short of that, having a PA or similar display is plenty good; I don't see a reason to run out and get an expensive video card just for that functionality.
     
  17. EricM (Planet Eric)

    http://www.tedlansingphotography.com/blog/?p=287
    https://luminous-landscape.com/finally-here-10-bit-lcd-graphic-monitors/
    If I were building a system from scratch (that be on a Mac), and I was certain the new OS supports that (still not sure)​

    El Capitan does support 10-bit
     
  18. digitaldog (Andrew Rodney)

    El Capitan does support 10-bit​
    Maybe...

    https://luminous-landscape.com/finally-here-10-bit-lcd-graphic-monitors/​
    We've had high bit displays for years; nothing new in that old article. What's been missing, and what Apple hasn't specifically told anyone as yet, is whether El Capitan supports it for all displays that have a high bit path. Some sites have suggested it does; I've seen no proof. This was specifically asked on the Apple ColorSync user list, and the question remains unanswered by Apple.
     
  19. EricM (Planet Eric)

    Maybe...​
    "Among the many new features in OS X El Capitan, it seems Apple has silently integrated another one: 10 bit color for the 4K & 5K iMac. Very interesting news for colorists, photographers, and editors.
    A cinema5D reader reported that he got 10 bit on a Mac Pro with D500 graphics and an Eizo CS230 monitor. Also, currently it only works within the Preview and Photos applications. If you want to test it out, you could take a 12-bit RAW photo with soft color gradations and take a look. But it’s also important to note that, for now, no other apps, such as Adobe or other editing software, take advantage of this processing, yet. This is just a preview of what’s to come. For those who have been waiting for this feature for a long time, it’s important news."
    https://www.cinema5d.com/5k-imac-10-bit-color/
     
  20. digitaldog (Andrew Rodney)

    1. Among the many new features in OS X El Capitan, it seems Apple has silently integrated another one: 10 bit color for the 4K & 5K iMac. Very interesting news for colorists, photographers, and editors.
    Eric, do you understand the sentence that explicitly says "it seems"?
    A cinema5D reader reported that he got 10 bit on a Mac Pro with D500 graphics and an Eizo CS230 monitor. Also, currently it only works within the Preview and Photos applications
    Yet according to Chris Cox of Adobe, Photoshop has had support for this for a while. It seems maybe this OS support isn't fully, or even partially, implemented. See if you can find anything from Apple that officially states there's now a true, high bit OS display path.
     
  21. EricM (Planet Eric)

    I'm comfortable with the information and believe it to be true. If you feel otherwise, then it's up to you to post contrary info from Apple. Don't knock yourself out though, as it means little to me, a Windows user.
     
  22. digitaldog (Andrew Rodney)

    As usual, you've missed the point, Eric. I said maybe, and that nothing official has been announced by Apple. Maybe means it is possible. You can believe a site that clearly wrote "it seems", but I prefer facts from sources that know, not guess. As usual, a request that you back up a claim goes ignored. I've got a number of sources looking into this, sources you can't call upon. When I get facts, I'll pass actual data on to the group. I'd prefer the facts to be: high bit display IS now real in Mac OS! But the so-called data you provided clearly illustrated a maybe. Trust but verify: try it sometime.
     
  23. EricM (Planet Eric)

    Flippant, as usual. Please just link us up with facts instead of your wishy-washy BS.
     
  24. digitaldog (Andrew Rodney)

    Please just link us up with facts instead of your wishy washy bs.​
    Exactly what I requested of you.
     
  25. Andrew, you are putting up a "maybe" against an "it seems", both expressing a degree of incertitude. Serves no purpose.
     
  26. digitaldog (Andrew Rodney)

    Andrew, you are putting up a "maybe" against an "it seems", both expressing a degree of incertitude. Serves no purpose.​
    Where's the critical thinking here? I said maybe because it is a maybe; we've heard absolutely nothing from either Apple or Adobe about this possible new feature. Or maybe, unlike Eric, you can provide data directly from Apple? I very much hope that El Capitan has implemented a high bit video path! I really do. Look at the statement most people are using to suggest this is true:
    "Among the many new features in OS X El Capitan, it seems Apple has silently integrated another one: 10 bit color for the 4K & 5K iMac".
    It seems? It seems the Earth revolves around the Sun? No, it actually does.

    Now examine the rest of the text: 10 bit color for the 4K & 5K iMac.
    ONLY 4K and 5K iMacs? Or any display, like my SpectraView, that absolutely does support a high bit path?
    As perhaps you know (apparently Eric doesn't), for a high bit video path every component must support it: OS, video card, video card driver, display and application. So "it seems this might be true for the iMac" either makes no sense, or might mean Apple has produced a unique video card or video card driver (they write the drivers) that only works with an iMac. WE DON'T KNOW.... Yet.
    The article states that only Preview and Photos work, even with that hardware. Odd. Does that mean no other software product supports the high bit display path because Apple has introduced something 'new' into those two products that no other 3rd party products yet support? WE DON'T KNOW.... Yet.
    As I stated and will state again: where's the critical thinking, the proof? I absolutely hope OS X finally has the necessary high bit support that was the missing link for so many years. But until I hear from someone who can speak with authority, and that isn't Eric by a long shot, I'm going to stick with maybe. Maybe yes (hopefully), maybe no (hopefully not).
    Now again, if you, Eric or anyone else has information that explicitly states this is true, I'm all ears. And again, I've emailed a few folks that DO know the facts including Chris Cox a senior Adobe engineer as well as the Apple ColorSync list. No word back. That's why maybe is the safest language to use at this time.
     
  27. EricM (Planet Eric)

    I like dealing with facts from a reputable source. And here that is, from my contact at Adobe:
    Apple added 30-bit support for 10.11. It only works on certain displays and it works better on their 5K displays (even better on the latest gen iMac).
    The next update for PS will support 30-bit color on Mac.
    http://macperformanceguide.com/blog/2015/20151030_1036-OSX_ElCapitan-10bit.html
     
  28. digitaldog (Andrew Rodney)

    Eric, the URL is interesting and useful (thank you) but still doesn't provide full clarity as to what's going on! The Nov 2nd update is telling: I still don't have full clarity from Adobe... and further, the text clearly states that assumptions are still ongoing:
    MPG: The NEC PA series displays have long been 10-bit panels with internal 12 or 14-bit true calibration. So I assume they’ll work great, even on a 4K UltraHD display*.
    Update 02 Nov: I still don’t have full clarity from Adobe on which displays are supported, or even whether 10-bit-capable displays like the NEC PA series will support 30 bit. Something about dithering when 10-bit is enabled, which would be a huge disappointment.
    With Apple displays, it’s not clear whether there is any API for true calibration. So calibration solutions on the market may do a lot better with 10-bit video for faux calibration, but that still would not be true calibration.​
    So, we still don't know. WHY it is taking so long after the release of this OS is suspect! Again, I hope we finally have high bit support in the OS. Further, which cards will work? Is the graphics card in the new iMacs in some way unique? Can one pop it into any Mac? Are 'special' video card drivers necessary, or even available?
    WHO has proof that, outside of a new iMac, high bit video support exists? In fact, who has proof it works with the new iMac? I don't have one; it seems someone should be able to demonstrate that, even with this new hardware, they are indeed getting a true, high bit display path!
    As for "The next update for PS will support 30-bit color on Mac": I'm a beta tester for that product and I've asked; no answer. So that too is questionable. FWIW, I know exactly what's coming in the next release and I've seen nothing that states 30-bit support is coming. Maybe!
     
  29. In fact, who has proof it works with the new iMac? I don't have one, seems someone should be able to demonstrate that even with this new hardware, they are indeed producing a true, high bit display path!​
    If you can't see the difference, why would it matter?
     
  30. digitaldog (Andrew Rodney)

    If you can't see the difference, why would it matter?​
    One should be able to see a difference; that's the point. Now, with a high bit display like my SpectraView, the differences are rather subtle, but yes, using the correct testing methodology I can see that my setup, even with El Capitan and a high bit panel, is not a full high bit path.
    NEC has a Windows utility specifically built so their customers can tell if there's a high bit path. The test file at http://www.imagescience.com.au can be used too. It absolutely IS visible.
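    For anyone curious what such a test file exploits: a very gradual ramp that is wider than the number of output levels forces an 8-bit path to repeat each value across a band of pixels. A rough Python sketch of the arithmetic (this is not NEC's utility, just the underlying idea; the width is an arbitrary example):

```python
# Why a slow gradient reveals the bit depth of the display path:
# the ramp has more pixel positions than an 8-bit path has levels,
# so neighbouring pixels are forced onto the same level in bands.

WIDTH = 4096  # an example ramp wider than either bit depth has levels

def ramp_levels(bits):
    """Distinct output levels when a 0..1 ramp is quantised to `bits`."""
    levels = (1 << bits) - 1
    return len({round(x / (WIDTH - 1) * levels) for x in range(WIDTH)})

print(ramp_levels(8))    # 256 levels -> each band ~16 pixels wide, visible
print(ramp_levels(10))   # 1024 levels -> bands ~4 pixels wide, far subtler
```

    With four times as many levels, a 10-bit path cuts the band width by a factor of four, which is why the steps become hard (but not impossible) to see.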
     
  31. digitaldog (Andrew Rodney)

    Got an email from one person in the know (Product Manager of NEC SpectraView):
    Long answer:
    Yes, but only:
    1. On supported hardware (confirmed so far: ATI FirePro, some of the newer Intel Iris Pro based chipsets).

    2. If the app is capable of requesting a 10 bit surface (so far Adobe apps can't, but I've been told they are in the process of being updated).

    3. Of course, the display must be able to support it. Luckily, there is nothing that needs to be set on the display side.
    What's a bit odd is #2: I have old email and public comments saying PS was all set, but it's possible a new rub has been introduced; waiting on word back from Adobe. Not sure what "requesting a 10 bit surface" means; looking into that too.
     
  32. digitaldog (Andrew Rodney)

    Photoshop was updated today. The Mac version is now supposed to support a high bit display path (not that you'll see them admit this anywhere <g>). Those with the proper video cards and displays should test it (trust but verify).
     
  33. digitaldog (Andrew Rodney)

    New and somewhat exciting data:
    http://petapixel.com/2015/12/04/adobe-quietly-added-10-bit-color-to-photoshop-cc-heres-how-to-enable-it/
    So much conflicting info. First of all, I was told that for this to really occur, I'd need a video card that supports high bit. The examples I was provided were specific: ATI FirePro, some of the newer Intel Iris Pro based chipsets. I'm using an older MacBook Pro (Retina, Early 2013) with an NVIDIA GeForce GT 650M 1024 MB. I did see banding on the now-famous test file with the latest version of CC released last week. But after reading the article and clicking on the 30-bit display check box, the banding is gone. So I'm happy, but not sure what's going on here. Can't argue with the better results I see on-screen!
     
  34. digitaldog (Andrew Rodney)

    Something may not be kosher here. I see NO banding on the MacBook Retina display with the new setting invoked. I do see banding when it's off. Is the MacBook Retina a high bit panel (news to me)? If not, should we be suspicious that Adobe is doing some kind of dithering here to smooth everything out?
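    Whether Adobe dithers here is pure speculation, but the general technique is easy to demonstrate: adding a tiny amount of noise before 8-bit quantisation trades banding for fine noise, and the eye, which averages nearby pixels, sees a smoother result. A minimal sketch of that idea (the tone value is arbitrary):

```python
# Sketch of dithering: a tone that falls between two 8-bit levels either
# snaps to one level everywhere (a band), or, with a little noise added
# before rounding, lands on BOTH neighbouring levels in the right ratio.
import random

random.seed(0)

def quantise8(x):
    """Round a 0.0-1.0 value onto the 0-255 integer scale."""
    return round(x * 255)

target = 100.4 / 255          # an in-between tone an 8-bit path can't hold

plain = [quantise8(target) for _ in range(10000)]
dithered = [quantise8(target + random.uniform(-0.5, 0.5) / 255)
            for _ in range(10000)]

print(sum(plain) / len(plain))        # exactly 100.0: every pixel identical
print(sum(dithered) / len(dithered))  # ~100.4: the average recovers the tone
```

    The same trick is why a well-dithered 8-bit path can look nearly as smooth as a true 10-bit one on a gradient, which would explain seeing no banding on a panel not known to be high bit.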
     
  35. EricM (Planet Eric)

    Save yourself all the headaches and use Windows like every other serious digital photographer. For someone who is so into colour management, it seems strange that you've anchored yourself to the most restricted OS.
     
  36. digitaldog (Andrew Rodney)

    Save yourself all the headaches and use Windows like every other serious digital photographer.​
    What a boatload of BS. But then, that's to be expected.
    For someone that is so into colour management, it seems strange that you've anchored yourself to the most restricted OS.​
    You really have zero idea of what you're talking about. The bit depth of the video path has nothing to do with color management. As usual, your rants and trolling suggest others should ignore your ignorance. Good job, Eric.
     
  37. digitaldog (Andrew Rodney)

    Save yourself all the headaches and use Windows like every other serious digital photographer.​
    A new low in rhetoric, even for Eric. More proof of the poster's hypocrisy too. Maybe Eric's traveled south and spent too much time at Donald Trump rallies to tune this rhetoric:
    Eric ~ , Sep 26, 2015; 02:45 p.m.
    Dave, professionals and serious amateurs alike, have been running laptops and external monitors on both Apple and Windows for a number of years now.​
    Is Eric both a serious photographer and a Mac user? One item is clear: he's a Mac user:
    Eric ~ , Oct 10, 2014; 02:01 p.m.
    I've only done the MBP and did it with a Samsung Evo ssd. It was easy. The MBP has a normal hard drive and the Air has a flash module hard drive that looks like a ram stick. Those look easy as well and can be bought at OWC​
    Lots and lots of other posts that indicate he's a Mac user.
     
