
Video card for dual monitors in Photoshop



I have seen some positive comments about the Matrox G550 video card for Photoshop use, and also comments about dual monitors generally suggesting that one should use two identical monitors in a dual-monitor setup (one for the image and one for the tools). For my amateur use, I want only ONE quality monitor for the image, and one cheapo (and small) monitor for the tools. My internet searching suggests that the G550 will cater for this. Any feedback from G550 users with a disparity in monitors would be appreciated.


You can also do just fine with pretty much any old PCI video card paired with your existing AGP card. Spending a lot of money on a Matrox is really only sensible if your current card is no good, or if you plan on using both monitors for critical work. For tools and the like, there is little point.

I use a Matrox G450 DDR in dual-head configuration, and it's been working great for me (driving a 17" monitor at 1280x1024, 32 bpp, 85 Hz and a 21" at 1600x1200, 16 bpp, 75 Hz with no problem under Windows XP). I've used several Matrox cards in the past, and I've been consistently impressed with the quality of the cards and the resulting image quality on high-end monitors; I wouldn't recommend anything else to anyone who wants to drive CRTs for anything except 3D games.

 

The G550 currently sells for $85 at www.compuplus.com; if you're on a tight budget, the G450 LX16 is $63 at the same place.


I will put in another plug for the Matrox brand. I am using their old G400 MAX. I used to have two monitors (both Sony CPD-G400) but left one at my parents' home when I moved to San Francisco for school. The 2D quality is excellent. If you want to play the latest 3D games, you might want to look at ATI or NVIDIA instead.

I paid CAN$300 for a G400 32MB SDR dual-head when they were still fairly new (about four years ago). The image quality and 2D acceleration are excellent. I have, however, consistently experienced stability issues with it in both Windows Me and Windows 2000, causing frequent lock-ups and requiring me to reinstall the drivers every couple of weeks or so. Perhaps the 550 is better in that regard (the pre-Parhelia cards pretty much all suck for gaming, allegedly; the G400 certainly does, so be forewarned about that).

 

Last year I bought a 'clone' ATI 7000VE 32MB DDR dual-head (an ATI chip on a third-party PCB) for CAN$90. The image quality rivals the Matrox, and it seems a lot more stable (though, to be honest, I haven't stress-tested it quite as ferociously). I haven't tried it in dual-head mode yet; the second output is DVI, and I got a VGA adapter for it.


For image quality, go with either Matrox or ATI (NOT NVidia; they're quicker for games but shoddier in image quality).

For dual-head, go with either a newer ATI (you'll need an adapter, since the ATIs have one DVI and one VGA connector) or a Matrox.

Make sure the output connectors are the correct ones for your monitors. Dual-DVI cards may come with adapters to fit normal CRT monitors, but may not... many LCDs have both VGA and DVI inputs, but some come with only one or the other...

Matrox gives an ultra-high-quality image, but it is not fast for the 3D graphics used in Doom, Quake, or Half-Life games (for me, "who cares?").

Two monitors are delightful, once one gets the hang of it.

 

I choose Matrox, and if I 'ad me druthers I would get one of their P650 cards: as fast as can be (for my stuff), AND as cheap as possible for the performance... An ATI Radeon 9700 Pro would be roughly equivalent (unless one were a gamer, in which case the ATI would be absolutely the only choice).

For just messing 'round, though (no *continuous* professional video-editing or professional multi-image-editing time constraints), I'd use a G400 dual-head, or a G450 or G550 dual-head, though the G400 is slightly faster in 2D than the G450 or G550, IIRC.

 

1. Check the prices.
2. Make sure whichever AGP version the card uses is going to work with your motherboard (different voltages are involved).
3. Check the customer ratings and experiences of the reseller you're considering doing business with (there are some very scary reputations out there).

http://www.pricewatch.com/
http://froogle.google.com/
http://www.resellerratings.com/

 

Should one use the same monitors? Maybe, but it's not necessary...

Identical *resolutions*, however, can simplify software setup drastically. (I'm in Linux, using the GIMP, http://www.gimp.org/, and different resolutions mean no Xinerama for me, so I can't have both monitors share a single continuous desktop in X. Bah!)
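For anyone on the same path, here is a minimal sketch of the kind of old-style XFree86 dual-head layout this affects. The "BigScreen" and "SmallScreen" identifiers are made up for illustration, and the matching Screen/Monitor/Device sections still have to be defined elsewhere in the same XF86Config-4 file:

Section "ServerLayout"
    Identifier "DualHead"
    Screen     0 "BigScreen"   0 0
    Screen     1 "SmallScreen" RightOf "BigScreen"
    Option     "Xinerama" "true"    # merge both heads into one continuous desktop
EndSection

With both screens at the same resolution the merged desktop is a clean rectangle; with mismatched resolutions you either give up Xinerama or live with dead areas along one edge.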

 

The thickness of the bezel around the screen becomes an issue (extra thickness for monitor-attached speakers is idiocy in a dual-head setup). And don't get a very old, cheap monitor for the second screen, or the magnetic interference (it looks like pulsing) induced in your bigger monitor will give you (and it?) migraines.

Also, you'll notice the different colour temperatures of the two monitors VERY significantly, so you may want to get a Diamondtron (NEC/Mitsubishi) monitor for the good one; I hear the Mitsubishi versions are wonder-licious..

Good luck.


What is your opinion of a Mac G4 running an NVIDIA GeForce4 MX with a LaCie 19blueIII and Blue Eye calibrator? How is color consistency, etc., on this compared with other comparable video cards? Would I be better off running a different card for color correction and Photoshop?


Good point about checking the AGP version of your motherboard, D. Antryg M. Revok. The G-400, while originally advertised as AGP 2X, only works reliably in 1X (and runs in 1X by default). Many newer motherboards will not run anything slower than 4X. I personally would not buy another one because of that (I'm running mine in a prehistoric Slot A board).

The comments about certain video cards having better 2D image quality are absurd, written by PC magazine editors who can't tell subtractive from additive primary colors. Years ago Matrox and Number 9 got a reputation for building high-quality graphics cards for imaging work, and the rep has stuck, even though I haven't used local-bus video in years, ahem.

 

Matrox, ATI, Nvidia, and the million or so OEMs that build these cards all use off-the-shelf, nickel-a-dozen D/A converters that vary heavily from OEM to OEM, which undercuts any claim of superior color accuracy from brand to brand. All the newer versions of these cards also feature an array of display tweaks that let you adjust saturation and gamma to taste. You should learn to use these controls to your advantage rather than worry about which cards yield better image quality in 2D mode.
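For what it's worth, the gamma slider in those driver control panels boils down to a simple per-channel curve, something like the sketch below (illustrative only: the function name and the gamma value are invented here, and real drivers apply the curve through the card's lookup table rather than in application code):

def apply_gamma(value_8bit, gamma=1.2):
    """Map one 8-bit channel value through a simple gamma curve."""
    normalized = value_8bit / 255.0           # scale 0..255 down to 0..1
    corrected = normalized ** (1.0 / gamma)   # gamma > 1 lifts the midtones
    return round(corrected * 255)

print(apply_gamma(128))   # mid-grey 128 comes out around 144 with gamma 1.2

The point is simply that these are curve tweaks applied after the fact; they don't depend on which brand of D/A converter happens to be soldered onto the card.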

 

I've got a stack of AGP video cards in the corner, since I seem to accumulate high-end PC parts for no good reason. I have a dual-monitor GeForce MX, a dual-head Matrox 450, a couple of 900x Radeons, a GeForce FX, and the rest unidentified. I've used them all at one time or another and really can't tell their 2D image quality apart with the latest drivers and a solid monitor. How much caffeine I've drunk, or how long my monitor has been warmed up, is a far bigger variable.

 

My own preference is for the AGP Ti-based GeForce3 cards, which do seem to separate themselves from the pack in terms of build quality. They seem to be the fastest in 2D refresh, even over the much more expensive cards, and they run in any type of PC, unlike the latest ATI Radeons, which are a joke in terms of system compatibility. Nvidia's FX cards aren't a whole lot better on that front. I don't have any issues with the older Matrox cards like the 450 or 550, other than that I just don't use them. They are certainly less quirky and more stable than the newer power-sucking game cards.


I use a G550 in my AMD XP1900+ machine. It effortlessly pushes a 19-inch Sony 420 at 1280x1024 with an 85 Hz refresh and a 15-inch Sony LCD at 1024x768. I keep my tools on the LCD. Stability is never an issue; the setup is rock solid. XP Pro recognizes the card on a clean install and downloads the drivers automatically. I followed the usual procedure of adding the card to my previous Win2K system without any problems... truly plug and play. I usually do B&W, so I have never noticed a discrepancy in colors on the LCD compared to the CRT. A great card which, when you factor in the price, is even better.

"...The G-400, while originally advertised as AGP 2X, only works reliably in 1X (and runs in 1X by default). Many newer motherboards will not run anything less than 4X. I personally would not buy another one because of that (I'm running mine in a prehistoric Slot A)..."

I'm using one right now, and it reports (via the Linux command 'lspci -vv') that it is AGP 2.0; it's also a 2x/4x card, according to Matrox. (4x doesn't work in this 4.5-year-old motherboard; its current status, as it is running right now, is 1x/2x, so 2x definitely works.)
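If you want to check what your own card and motherboard have negotiated, that same command can be filtered down to the relevant lines. This is only a rough sketch (the exact wording of the capability lines varies between lspci versions, and you may need to run it as root to see the capability details):

# show the AGP capability line for every device that advertises one
lspci -vv | grep -i agp

On an AGP card this typically turns up a line like "AGP version 2.0" under the display controller's capabilities, with the supported and currently selected rates listed nearby in the full -vv output.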

 

"...The comments about certain video cards having better 2-D image quality are absurd and written by PC magazine editors that can't tell subtractive from additive primary colors..."

Judging from the comments of the assembly programmers who write game engines, NVidia is more concerned with 16-bit, ultra-high-speed gameplay than ATI, which is more concerned with higher-bit-depth image quality.

 

Quality *that is given silicon* matters to these people, and it matters to me. Yes, I know that one can run any modern video card at high bit depth, but if it's engineered to run primarily at 16-bit, with a *secondary* higher-bit-depth path, then that isn't my first choice for image quality.

I don't know if they use dithering to 'fake' high bit depth while keeping 16-bit speed as the normal mode, but I wouldn't be surprised, given the recent stories about GPUs, drivers, 'optimizations', et al...

Also, NVidia is opposed to open-source drivers (NVidia motherboard chipsets don't work with stock Linux kernels before 2.4.21)... also, NVidia motherboard chipsets are engineered to be unable to use Error Correcting Code RAM (for system integrity), whereas all the others (generally; VIA and AMD always...) can... and the 16-bit path is primary in their graphics chips? I'm sticking with their competitors, thanks: compromising integrity for speed isn't my preference in a system's foundation, or in a partner.

 

Also avoid, at ALL costs, In-Plane Switching (maybe IPS/TFT, I simply don't know yet) based LCDs, which are a new technology (I don't know why it was developed: higher brightness, wider viewing angle, shorter pixel switching time, or whatever). Stick with the older technology, the Twisted Nematic or Supertwist TFT, or whatever it's called by your favourite manufacturer.

VERY IMPORTANT: look for IPS, a bit down this page:
http://forums.us.dell.com/supportforums/board/message?board.id=latit_video&message.id=11782&view=by_date_ascending&page=9

Cheers.


"The G-400, while originally advertised as AGP 2X, only works reliably in 1X"

Back shooter makes a solid point I forgot about, and that's older, first-generation video cards not running in newer motherboards due to their slower AGP speed. Case in point: I just popped the Matrox 450 into a KT400-based motherboard with AGP 4x/8x, and the machine refuses to boot. I tried a TNT2 Ultra in the same machine and it also refuses to boot. Another reason to avoid older video cards with newer hardware.

"NVidia is more concerned with 16-bit ultra-high-speed gameplay, from the comments of the assembly-programmers who write game-engines, than is ATI, who're more concerned with higher-bit-depth image-quality."

Uh, says you, and the marketeers at ATI are sure spending a lot of time pushing game benchmarks. Again, the quality of the display in 2D mode with these cards is about 90% down to the quality of the analog circuitry, and strictly related to the quality of the D/A converter, unless you are running a straight digital out. Most of your other remarks have to do with DirectX or OpenGL filtering, which again has nothing to do with 2D quality. This is photo.net, not gamers.net.

"Also, NVidia is opposed to open-source drivers (NVidia motherboard chipsets don't work with stock Linux kernels before 2.4.21)..."

Another Linux whiner trying to force a company to make a technical/business decision based on his convenience... a big reason for me to buy NVidia, knowing they are spending R&D on my OS.

I personally don't like any of the newer ATI cards, or the newer NVidia cards, because *all* the brands have ceased having any interest in true 2D quality and are engineering their cards purely for game speed. Any of the older cards can suit the task just fine, but then you have to deal with motherboard compatibility. This is why I stick to GeForce3 Tis or ATI 8500s. Both tend to run in anything.


Engineered primary design does influence function.

"...significant issue that clouds current ATI / Nvidia comparisons is fragment shader precision. Nvidia can work at 12 bit integer, 16 bit float, and 32 bit float. ATI works only at 24 bit float. There isn't actually a mode where they can be exactly compared. DX9" [which is what display happens through, in MS Windows] "and ARB_fragment_program assume 32 bit float operation, and ATI just converts everything to 24 bit" [a slight quality loss here, going from 4,294,967,296 'colours' to 16,777,216... BUT if the NVidia driver is running at 16-bit or 12-bit, because it's internally set to, say, by NVidia's famous 'drivers' (which have been known to throw away entire portions of the image to optimize speed), then quality loss is guaranteed, whether it is noticeable or not -- notice that they engineered 12-bit and 16-bit in for speed, not image quality, even though DX9 can't know such bit depths even exist, let alone whether such bit depths are currently running] "...For just about any given set of operations, the Nvidia card operating at 16 bit float will be faster than the ATI, while the Nvidia operating at 32 bit float will be slower..." -- John Carmack
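To put plain numbers on the bracketed aside above: the 'colour' counts quoted there are just 2 raised to the bit depth. A couple of lines of Python shows the scale involved (purely illustrative; the float formats spend some of their bits on an exponent, so the number of genuinely distinguishable levels is lower than these raw counts):

for bits in (12, 16, 24, 32):
    print(bits, "bits ->", 2 ** bits, "levels")
# 12 bits -> 4096 levels
# 16 bits -> 65536 levels
# 24 bits -> 16777216 levels
# 32 bits -> 4294967296 levels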

Matrox is 32-bit through and through, IIRC, and they have no hope whatsoever of competing in the 'gaming' field, and everyone knows it. Video editing and image editing are their engineered primary design, and if you are using the DVI outputs on a dual-head Matrox, going into a pair of monitors with DVI inputs, then you are rather likely to notice the difference in quality and integrity.

Some hold that working in high bit depth, end to end, is a contributing factor in getting high-integrity end results, and in having the displayed results correctly match the internal results.

...in software, anyway...

Some hold that digital aliasing and the like only happen when using software, but not when using an I/O device (printer, scanner, monitor)... or perhaps it's only particular I/O devices that are affected?

I'm not a gamer, and if I never see any 'screen shot' produced by the... whatevers... at id Software, where Quake et al. come from, I'll be happier: if I want to 'destroy monsters', I'll engage in system configuration on a machine some no-integrity-valuing 'sysadmin' built.

 

NVidia doesn't have to pay the Linux community for the Linux community to create drivers for their products, but they won't allow, *on principle*, customers to use their paid-for products in ways that don't obey their authority(?) (or whatever it is).

Just as they seem to be the only chipset makers who don't *permit* ECC to be part of motherboards based on their memory-controller designs (AMD and VIA I'm certain about; Intel and SiS I'm not certain about).

I'm pleased your profoundest heart-determination is declared so openly and so clearly, and I hope to eradicate my soul/continuum from your 'reality/world' so you can make the world obey your mode entirely, without interference, and be eternally and entirely happy.

I'll continue evolving hidden-heart meaning, though, since the alternatives are, to me, inferior.

I hope you live very much longer than I do.
