
Do I need a video card to work in 16 bit?



A while back I started working in 16 bit via PS CS. My setup is a G4 1.25 dual with a calibrated 17" Viewsonic CRT. With 16 bit images I get an obvious stairstep quality to edges at lower screen magnifications, up to about 25%, which then goes away as the magnification increases. I don't get this effect with 8 bit images. Does this have to do with the need for a 16 bit video card, which someone said I needed, or perhaps the monitor itself? I don't consider this a big problem, just an annoyance at times.


It would seem that the factory-installed video support, an ATI Radeon 9000 Pro with 64MB DDR SDRAM, should do the job? I know next to nothing about video cards, hence the confusion on my part in regard to the advice I was given. Can anyone fill me in?


I don't know about the Mac world, but in the PC world, few if any video cards can display more than 8 bits per channel. But PC display settings refer to the total number of bits per pixel, rather than bits per channel, so to get the full number of colours, the display driver needs to be set to 24 (8 bits per channel, with red, green, and blue channels) or 32 (sometimes done to provide alpha or Z channels, but often done as a more efficient way of organizing video memory) bits per pixel. If you set the display driver to 16 bits, you're telling it to divide those 16 bits between three channels, so you usually end up with 5 bits per channel, either with one bit wasted or one extra bit allocated to green (since human vision is most sensitive to green).
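
To make the bits-per-pixel vs bits-per-channel point concrete, here is a minimal sketch in Python (my own illustration, not from any post in this thread) of the common "RGB565" packing a 16-bit display mode uses; the exact bit split can vary by driver.

    # A minimal sketch of how a 16-bit display mode packs a pixel: 5 bits red,
    # 6 bits green, 5 bits blue ("RGB565"), versus 8 bits per channel in 24/32-bit modes.

    def pack_rgb565(r, g, b):
        """Quantize 8-bit channel values down to 5/6/5 bits and pack into 16 bits."""
        r5 = r >> 3          # keep the top 5 bits of red   (32 levels)
        g6 = g >> 2          # keep the top 6 bits of green (64 levels)
        b5 = b >> 3          # keep the top 5 bits of blue  (32 levels)
        return (r5 << 11) | (g6 << 5) | b5

    def unpack_rgb565(p):
        """Expand back to approximate 8-bit values; the lost low bits cause banding."""
        r = ((p >> 11) & 0x1F) << 3
        g = ((p >> 5) & 0x3F) << 2
        b = (p & 0x1F) << 3
        return r, g, b

    # Two nearby greys that are distinct in 24-bit colour collapse to the same
    # value in a 16-bit display mode -- this is where posterization comes from.
    print(unpack_rgb565(pack_rgb565(200, 200, 200)))  # (200, 200, 200)
    print(unpack_rgb565(pack_rgb565(201, 201, 201)))  # (200, 200, 200)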

 

Setting the display driver to too low a number of colours would definitely cause posterization, but it should happen to any susceptible image at any magnification and at any bit depth, which doesn't sound like your issue.

 

It won't be the monitor.


Some do, but it's awfully hard to find information on the actual display bit depth various cards use. Most video card/chipset manufacturers don't provide this information; I can't find any Intel or Nvidia documents about it, nor does ATI seem to provide it for most of their recent cards (though they do say, in a couple of different documents, that with their older Rage products there is absolutely no difference in colour depth between 24- and 32-bit modes, and the same is true for the Radeon 7500: http://www.ati.com/products/pdf/ImageQualityWhitePaper.pdf). Many third-party reviews don't cover this information either.

 

In some cases, 32-bit display depth and 24-bit display depth both provide 8 bits per channel; the 32-bit setting helps performance by aligning the data on doubleword boundaries (since almost everything can handle data 32 bits at a time, and almost nothing natively handles data 24 bits at a time).
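
As a rough illustration of the alignment point (my own sketch; the resolution is hypothetical), a "32-bit" mode can still carry only 8 bits per channel, with the fourth byte acting as padding so every pixel sits on a 4-byte boundary:

    # Why "32-bit" can still mean 8 bits per channel: the extra byte just pads
    # each pixel so the card reads one aligned 32-bit word per pixel.

    width = 1280  # hypothetical horizontal resolution

    bytes_per_row_24bit = width * 3   # packed RGB: pixels straddle word boundaries
    bytes_per_row_32bit = width * 4   # RGBX/RGBA: every pixel starts on a 4-byte boundary

    print(bytes_per_row_24bit)  # 3840 bytes; pixel N starts at offset 3*N
    print(bytes_per_row_32bit)  # 5120 bytes; pixel N starts at offset 4*N (aligned)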

 

In the specific case of the card mentioned for this particular Mac, it's 8 bits per channel; ATI claims no colour depth greater than 16.7M (http://www.ati.com/products/radeon9000me/radeon9000prome/specs.html), which is 24 bits.


Sounds like I've wandered into the techno swamp. All I know for sure is that my monitor preferences are set to millions of colors, and my monitor controls don't show anything about setting different display sizes in regard to bits. If it isn't broken ...

 

To those who jumped in here, would you guess that the stair-stepping effect I see has to do with working in 16 bit with my current video card, or something else?

 

I did a Google search for 16 bit video cards, and video cards for 16 bit image editing, and came up with a bunch of info that went over my head and didn't sound like it pertained to the issue at hand.

 

Thanks everyone! I'll do a little more research and see what evolves.


For digital photography, 8-bit vs 16-bit does not represent an increase in color range but rather an increase in color accuracy. Your computer will only display 8-bit color (0-255 in R, G, B, for 16.7 million color combinations) when looking at a 16-bit image. 16-bit images give better accuracy (not a wider color range) because you have far more tonal levels per channel to work with. All printing (and viewing) is done in 8-bit mode. For more info check out 8 vs 16 bit: http://www.earthboundlight.com/phototips/8bit-versus-16bit-difference.html
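
A quick numerical sketch of why those extra levels matter during editing (my own illustration, assuming NumPy is available; the darken/brighten round trip is just an example edit, not anything from the linked article):

    # Apply a strong darken-then-brighten round trip at two bit depths and
    # count how many distinct grey levels survive in the 8-bit result.

    import numpy as np

    gradient = np.linspace(0.0, 1.0, 256)  # a smooth ramp of tones

    def round_trip(values, bits):
        """Darken by 4x, store at the given bit depth, then brighten by 4x."""
        levels = 2 ** bits - 1
        darkened = np.round(values / 4.0 * levels) / levels   # quantize after the edit
        restored = np.clip(darkened * 4.0, 0.0, 1.0)
        return np.round(restored * 255).astype(int)           # view the result as 8-bit

    print(len(np.unique(round_trip(gradient, 8))))   # roughly 64 surviving levels -> banding
    print(len(np.unique(round_trip(gradient, 16))))  # roughly 256 levels -> smooth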



 

Hopefully the page works now ... the 16-vs-8 bit link wasn't closed properly, so the reply button didn't work.

 

You shouldn't get stair-stepping due to the number of colours, unless you've set it ridiculously low. Millions of colours should be fine. I don't know why you get this effect. Are your display drivers, Mac OS, Photoshop, etc. all up to date?


As this is only occurring at lower screen magnifications, and since one can only judge what the data truly looks like at 100% magnification, what you are seeing is interference between the on-screen interpolation algorithm and the use of 16 bit data. After all, banding at lower magnifications tells you things about the on-screen interpolation algorithm and not about the look of the actual data.
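
A small sketch of that idea (my own illustration in Python/NumPy, not a description of Photoshop's actual preview code): subsampling every 4th pixel, as a fast nearest-neighbour preview might at 25%, keeps edges hard and jagged, while averaging 4x4 blocks, closer to a smoother interpolation, softens them.

    # A one-pixel-wide diagonal line, viewed at "25%" two different ways.

    import numpy as np

    size = 64
    image = np.zeros((size, size))
    for i in range(size):
        image[i, i] = 1.0          # a thin diagonal line

    nearest = image[::4, ::4]      # take every 4th sample (fast preview)
    averaged = image.reshape(size // 4, 4, size // 4, 4).mean(axis=(1, 3))  # 4x4 block mean

    print(np.unique(nearest))      # only 0 and 1: the line stays hard-edged and stepped
    print(np.unique(averaged))     # 0 and 0.25: the line becomes a faint, smoothed edge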

 

I personally have not experienced this problem (I run PS CS on XP), but since PS only ships 8 bit color to the display (without special hardware and drivers), this instead points to a display problem at low magnifications. As you are seeing this only with 16-bit color images, this suggests that there may be an interpolation issue at lower magnifications with 16-bit color on a Mac.

 

That said, you should check three things:

- Take an image you have this issue with, convert it to 8-bit color, and then see if this is truly your problem, or if your current shooting subjects have this problem.

- Check Edit -> Preferences -> General. What interpolation setting are you using? I have Bicubic Smoother selected.

- Ensure that your display is using 8 bits per channel, which is commonly denoted as True Color, 24-bit color, 32-bit color (the extra 8 bits are the alpha channel), or Millions of Colors.

 

There are many possible sources of trouble here, and until the problem itself is isolated, throwing money at it is silly (and likely wasteful).

 

hope this helps,

 

Sean


Hi Sean, thanks for the input! I did a quick test and the results are the same. I converted a 16 bit file to 8 bit and the stair stepping looked the same, in the 16-33% range. Also, I changed my preference from bicubic to bicubic smoother and there was no change. It's not hard to live with, and I always check results at 100%; when I'm viewing at a lower magnification and see the stair stepping, I increase magnification to the next level and it generally goes away. In fact, the stair stepping seems to alternate from one magnification to the next as it increases, up to about 33%. All in all, there are worse things to live with!

