
Personally I prefer CRTs. I have a 19" Philips that was pretty expensive about 3 years ago, considering that CRTs were getting cheaper. I paid about $370 for it, and it was the best CRT that I saw. Even today I have not seen many CRTs better than my Philips. Maybe the blue LaCie displays would be better, or the top-of-the-line Sony monitors. Even nowadays those cost about $700 or so. They are big, but they will be better than any LCD, at least that is what I have seen.

 

One thing comes to mind, though... Don't CRTs display more colors and more contrast than prints? If this is the case, should we really care to use such a high-end display?


It's my understanding that LCDs still shift a bit when you move off-axis: they lighten, darken, shift color, or change contrast. I know they're better than they were a couple of years ago, and for all I know, I may be living in the past; I haven't done side-by-side comparisons. For now (meaning "until it breaks", probably) I use a CRT. I plan to get an LCD as a second monitor but will use it for toolbars and such.

 

Of course, then I'll put images on it and compare to the CRT...


LCDs have improved over the years, and I prefer mine over my CRT in most respects. For one, CRTs may have become flatter, but they are not flat, and they produce distortions in the shape of the image that shift frequently. In other words, CRTs cannot consistently render accurate shapes. LCDs are sharper, making them helpful for retouching. LCDs do not reflect substantial amounts of ambient light, so you are generally fine without an ugly hood. Subjectively, my LCD produces more vivid images. Objectively, my monitor calibrator measures a wider color gamut on my LCD, but a larger contrast ratio on my CRT. LCDs are brighter, and you can vary the brightness without losing contrast. LCDs use 1/3 the power of same-sized CRTs, and they produce far less heat. CRTs can become thrown off by environmental factors such as magnetic fields (not an enormous issue, but it can be irritating).
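As an aside, the contrast ratio a calibrator reports is simply peak white luminance divided by black-level luminance. A minimal sketch, with made-up cd/m² figures rather than measurements from any particular monitor:

```python
# Contrast ratio = peak white luminance / black luminance.
# The cd/m^2 values below are illustrative only, not real measurements.
def contrast_ratio(white_cd_m2: float, black_cd_m2: float) -> float:
    return white_cd_m2 / black_cd_m2

# A hypothetical monitor: 200 cd/m^2 white, 0.5 cd/m^2 black -> 400:1
print(f"{contrast_ratio(200.0, 0.5):.0f}:1")
```

This also shows why "varying brightness without losing contrast" matters: on a CRT, raising brightness tends to raise the black level too, shrinking the ratio.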

On the other hand, LCDs shift color with the viewing angle, although they have improved, and I have not found this to interfere with image editing. CRTs can also squeeze in greater resolution, although the flicker rate will slow, possibly inducing nausea and headaches.

Ergonomically, there is really no comparison, and anybody who spends much time in front of the screen would be foolish to overlook this. LCDs are light and easy to adjust, cause less eye strain due to the flatness and sharpness, and do not produce a flicker.

"One thing comes to mind, though... Don't CRTs display more colors and more contrast than prints? If this is the case, should we really care to use such a high-end display?"

 

A monitor may display more colors and contrast than a print, but its color gamut does not include many colors found in prints. In other words, for printing, you need to judge how much of the printer's color gamut the monitor can reproduce.


I work with both LCDs (Apple Cinema Display) and CRTs. For critical image and colour evaluation I much prefer CRT. It's smoother and cleaner; LCDs have a fine granular texture to the screen which gets in the way of the image, like a layer of condensation or frost, which is quite annoying. The better LCDs don't change colour with viewing angle, but they are still not as tolerant of off-axis viewing as CRTs. For pre-press use I also find LCDs too contrasty and overbright. They are nicer to look at for long periods, though, and for non-critical work they are better for that reason alone.

I bought an LCD because I spend roughly 14hrs a day, sometimes more, in front of a computer - CRT's were literally making me blind. The LCD is *so* significantly easier on my eyes that there's just no comparison. Since I've just been into image editing this year, I can't compare between CRT and LCD. However, I find the image to be sharper in general on my LCD than any CRT I've seen so far. The color shift when you change angles is very slight on my monitor (a Dell branded flat panel) and I find the color rendering to be more vivid. My reasons for buying one had more to do with serious eye strain, but I've been perfectly happy with image editing/display. Good luck :)

Most LCD monitors have very poor skin tones, even if you turn down contrast. However, they are quite good for nature shots, with excellent greens. Yes, aperture grille (not necessarily Trinitron) is considered superior to shadow mask CRT. Probably it's best to have a 2-monitor setup with both LCD and CRT.


"Most LCD monitors have very poor skin tones..."

 

I don't know about "most" LCDs; I'm sure this is true of some of them, and may have been generally true at one time, but no longer. The better LCD panels available today produce excellent skin tones.

 

Last week, my niece took a picture of herself and her younger brother, using my wife's P&S digital, holding the camera in front of their faces at arm's length, and firing (with flash!). The skin tones, when viewing the image on my Samsung 193P, are fantastic.


The Sony Trinitron is an aperture grille monitor. This type generally has a crisper, contrastier and more saturated image than shadow mask monitors.

 

For years the Trinitron was probably the best but I think it's been eclipsed by the Mitsubishi aperture grille CRTs. Mitsubishi was the first to diminish the appearance of the horizontal stabilizing wires that were a little distracting on the Trinitrons.

 

Some NEC monitors also use Mitsubishi aperture grille CRTs and are occasionally a little less expensive, tho' the Mitsubishi brand monitors are reasonably priced.

 

I haven't bought an LCD display because every time I've tried to get used to them in the stores (for several years now), I just find them annoying. They look unnatural to me. There's no other choice with notebooks but for now I can't see replacing a desktop CRT with an LCD.


I don't believe CRT is significantly better, if at all, but even if it were you should still be using LCD for safety reasons:

1. Many people get headaches from 60Hz CRTs, and some (including me) can see the flicker. This is the frequency the much-vaunted Trinitrons-on-Macs ran at in the bad old days. Now you can expect 75Hz, which is still too low for some. If you're buying the monitor for an employee, you'd want 85Hz minimum. Good luck finding this at high res.

 

2. CRTs charge the air in front of them. (The degaussing coil works only at intervals; you can hear it go.) It has been shown that bacteria, dust, skin particles, cat dander, etc. get charged and float across to your face. This causes skin infections.

 

3. They were a leading cause of residential house fires in Australia in the 1990s, just behind gas "space heaters", which have a large installed base in this country.

 

It's human nature to ignore the risks of something ubiquitous. Tobacco and asbestos come to mind.

 

Beyond the safety aspects, you should be looking at a DVI interface and high resolution, and there probably aren't CRTs with such an interface.


To address the skin tone issue -- my particular LCD has terrific skin tones. You're likely to find (if you haven't noticed it in this thread alone! <g>) that there's no solid and agreed upon answer. It's the Canon/Nikon, Democrat/Republican argument - no matter what you choose, half the ppl out there will disagree with you :) LCDs are very good these days. CRTs are too. If one fits your budget better than the other, or your available space, your design aesthetics, technology fetish or biological imperatives - go for that one, whatever it is, and you're likely to be very happy!

Like most everything in life, the more you pay, the better the equipment becomes. If you're comparing low end LCD's to CRT's the CRT wins every time. The LaCie Electron Blue CRT monitor - which can be had for about $370 - is a great buy and will be far more accurate than most low to mid priced LCDs.

 

If you look at high-end LCD's ($1800 and above), the LCD's work every bit as well as a CRT. The comments about off-axis viewing, color gamut, contrast etc. don't apply to the top LCD's - they just flat out work for photo / graphics editing. The Apple Cinema displays are the lowest priced LCD's that truly work for photo editing. The higher priced screens from Eizo look even better.

 

There are some mid-priced LCDs ($750-$1200) from Samsung and Dell that are getting close to the high-end screens, and many people find them adequate. I looked at several LCDs before purchasing my Eizo. However, once you see the Eizo, work with its control software, and calibrate it, there's no going back.

 

If your budget doesn't allow you to purchase an upper end LCD, stick with a CRT.


I agree with swinehart.

 

 

You cannot compare low end with high end monitors.

 

Those who say that CRTs cause flicker only see it because their monitors are cheap and their video cards are cheap. My monitor can run at 85Hz at 1152 x 864 without any problems. I can go higher, but the icons start to be too small to please me. So there is no visible flicker at all.
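As a rough sanity check on why high refresh at high resolution demands a capable card: the required pixel clock scales with resolution times refresh rate. A back-of-the-envelope sketch; the ~35% blanking overhead is a common rule of thumb, not a figure from any spec or measurement:

```python
# Rough pixel-clock estimate for a CRT video mode:
# width * height * refresh rate, padded ~35% for blanking intervals.
# The blanking factor is a rule-of-thumb assumption, not a spec value.
def pixel_clock_mhz(width: int, height: int, refresh_hz: int,
                    blanking: float = 1.35) -> float:
    return width * height * refresh_hz * blanking / 1e6

# The mode mentioned above: 1152 x 864 at 85Hz
print(f"{pixel_clock_mhz(1152, 864, 85):.0f} MHz")
```

A cheap card (or cheap monitor electronics) that can't sustain a clock in that range is exactly what forces people down to 60Hz and visible flicker.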

 

I really do not know if an expensive LCD can be sharper than an expensive CRT. I do not believe it, but then I have no proof to deny it.

 

If your images look more saturated because of the LCD you are using, that may affect your visual image editing: the oversaturated image that you are seeing on your LCD is not really the image that you have, so if you send it to somebody else, they will not see what you see. That is why you need a high-end, well-calibrated monitor to really see a precise image of what you have captured on film or sensor.

 

Just because your image looks more vivid on your screen does not mean it's better.


Wow.  There is so much misinformation, half-truth, and Old Wives' Tales, all presented as Gospel, in this thread that I hardly know where to begin.  I am reminded of nothing so much as the old fable about Three Blind Men and the Elephant.


First, to the OP:


Frankie Frank, May 26, 2005; 05:08 p.m.: "Compared LCD to CRT monitor for photo editing, which one you prefer?"

 

It depends almost strictly on the particular monitors in question.  There are good CRTs, and there are crappy CRTs; similarly, there are good LCDs, and there are crappy LCDs.  Any attempt to lump them all together and declare one type of technology inherently superior to the other is doomed to failure -- and to the sort of voodoo computing "advice" you see so much of in this thread.  You might nearly as well ask, "Which is better, Nikon or Canon?"


That said, there are a few (near-)constants:

- CRTs are generally more flexible than LCDs in terms of the video display modes they will properly support. An LCD is really useful ONLY at its native resolution, for example.

- Many (perhaps "most"; but I won't go so far as to say "all", simply because I haven't seen them all) LCDs are relatively limited in the number of different colors they can produce; often, this limit is as low as 65,536 (64K, or 2^16) colors, far short of the 16,777,216 (16M, or 2^24) or 4,294,967,296 (4B, or 2^32) colors we tend to take for granted. A standard whatever-VGA CRT monitor uses analog inputs and signal amplifiers; hence it can produce an *infinite* number of colors, limited only by the video card driving it. While this is usually not an issue for general-purpose "virtual desktop" use (where 64K available colors is more than plenty), it's easy to see how it could impact critical photo-editing applications.

- For approximately equivalent performance, resolution and screen size, you can expect to pay significantly more (at least two to three times as much) for an LCD than for a CRT. This is especially true in the "mid-range" market most folks would presumably be shopping in. There are at least two reasons for this: One, LCDs are "hot" right now in the general computing marketplace, and basic economics tells us that more demand *always* equates to higher prices. Two, while production of desktop LCD displays has greatly increased over the past couple of years, they are still MUCH more complicated and difficult to manufacture with consistent quality, and the "yield rate" is still nowhere near as good as with CRTs. Yes, retail prices have definitely dropped vis-a-vis where they were two years ago (especially in the "medium-large" sizes such as 18-19"); but they still have a long way to go to be *really* competitive with CRTs.

- It is often (tho' perhaps not always) more difficult to properly calibrate an LCD monitor's color rendition, because the light source itself (the fluorescent lamp behind the screen) has its own color cast, as opposed to the CRT, where the phosphor coating generating the color *is* the light source. This is also why it is next to impossible to get really black blacks *and* really white whites out of the same LCD with the same set of adjustments.
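The color-count arithmetic in the list above is straightforward to verify; a quick sketch, using the same bit depths quoted there:

```python
# Colors displayable at a given pixel depth: 2 ** bits_per_pixel.
def color_count(bits_per_pixel: int) -> int:
    return 2 ** bits_per_pixel

for bpp in (16, 24, 32):
    print(f"{bpp}-bit: {color_count(bpp):,} colors")
# 16-bit -> 65,536; 24-bit -> 16,777,216; 32-bit -> 4,294,967,296
```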

Like I said, these are generalizations; but (unlike a lot of the "stuff" posted to this thread) they do tend to hold up.

 

"Among the CRT monitors, is Trinitron the best?"

 

In a word, NO.


"Trinitron" is Sony's trade name for aperture-grille CRTs -- a technology they developed for broadcast television applications (where it does a reasonable job of disguising some of that medium's inherent weaknesses) over 30 years ago, then pressed into service for computer monitors -- primarily because they already had it "on the shelf", *not* because it was really all that well-suited to this very different application (which it isn't -- see below). Sony's "steamroller" marketing, however, has since made a major chunk of the uninformed general public think it is some sort of panacea (which it also isn't). Think of it as their revenge for losing the Beta vs. VHS war.

 

"LCD is quite gentle on the eyes, but do they give good accurate colors and shadows?"

 

See above.


OK, moving on to some of the bad advice you were given...

Brian Ball, May 26, 2005; 06:36 p.m.: "LCDs have improved over the years, and I prefer mine over my CRT in most respects. For one, CRTs may have become flatter, but they are not flat and they produce distortions in the shape of the image that shift frequently. In other words, CRTs cannot consistently render accurate shapes."

 

You need to brush up on your geometry -- or Logic 101.


If, in both cases, you sit with your eyes directly perpendicular to the center of the screen (which you should, for any of several reasons), any geometric distortion produced by the CRT's face curvature will be minimized to the point of insignificance, because the variances (which aren't very large to begin with) will be solely on the Z (depth) axis. Is your eyesight especially poor in terms of depth-of-focus? That's about the only way I can envision the very slight differences in eye-subject distance having any discernible impact. In any event... If, under these conditions, you can't get really square squares and really circular circles out of your rig -- or if whatever distortions are there *ever* "shift", as you put it -- something is *grossly* out-of-whack.

 

"LCDs are sharper, making them helpful for retouching."

 

Define "sharper", in this context.


In terms of pure resolution, most CRTs can out-perform most LCDs by a significant -- nay, *wide* -- margin, particularly at anything like comparable cost.  (I am deliberately ignoring here the occasional insanely expensive aberration such as the 30-inch Apple Cinema Display).


If you are referring to the apparent "sharpness" of each individual pixel, there should also not be much to choose between them, unless at least one of the monitors (probably the CRT) is grossly out of adjustment.  Many (perhaps most) folks tend to jack up the Brightness and Contrast controls *way* beyond where they really ought to be set for optimum performance.  This can (and often does) cause "bloom" on CRT-based displays.  But place the blame for this where it belongs -- on the user (and especially on retail vendors who *habitually* do this), not on the CRT itself.

 

"LCDs do not reflect substantial amounts of ambient light, so you are generally fine without an ugly hood."

If you need a hood to kill reflections off a CRT's face, your ambient lighting is too harsh (and probably *way* too bright -- banks of fluorescent tubes in the ceiling, perhaps?).  Sure, that sort of over-kill lighting is pretty much S.O.P. in modern commercial offices; but that doesn't mean it's right.  Fix the *real* problem, instead of trying to apply inherently imperfect band-aids.

 

"Ergonomically, there is really no comparison, and anybody who spends much time in front of the screen would be foolish to overlook this."

 

Again, this depends greatly on the individual CRT and the specific setup of your workspace.  If the workspace is *properly* set up to accommodate it, even a massive 21+ inch CRT can be quite "easy to live with".

 

"LCDs are light and easy to adjust,"

 

True, in general.

 

"cause less eye strain due to the flatness and sharpness,"

 

Maybe, at best.  Beyond the issues discussed above, too much depends on the specific models being compared to make this a sweeping generalization.

 

"and do not produce a flicker."

 

Most definitely *NOT* true!


The light source for all common desktop LCD displays is one or more small fluorescent tubes placed behind the LCD itself.  These fluorescent tubes are (at least usually) driven off the 60Hz AC mains, just like the ones in your office ceiling; hence, they are subject to the same flicker issues as their larger brethren.  And yes, these issues are MUCH more pronounced than the so-called "flicker" produced by a CRT's raster scanning.

 

Bill Tuthill, May 26, 2005; 10:36 p.m.: "Most LCD monitors have very poor skin tones, even if you turn down contrast."

 

Only if they (or the systems and/or video cards driving them) are seriously mis-adjusted.  There is nothing about either CRTs or LCDs which would give one an inherent advantage over the other on "skin tones" (as opposed to any other sort of color rendition).

 

"However they are quite good for nature shots, with excellent greens."

Again, this depends on the individual monitor, and especially on how it (and the system/card driving it) is adjusted.

 

"Yes, aperture grille (not necessarily Trinitron) is considered superior to shadow mask CRT."

 

Only by folks who have been seduced by marketing hype, as opposed to really understanding the underlying technology.


By virtue of their fundamental design, aperture-grille CRTs inherently have poorer resolution (especially in the vertical axis) than their conventional shadow-mask cousins.  This typically makes interlaced NTSC broadcast signals (for which the "Trinitron" CRT was originally designed) look great, because it masks some of the underlying problems of that crappy signal (and secondarily, because it makes misconvergence -- long the main bugaboo of consumer TV sets, but far less of an issue now than it was even 5-10 years ago -- less debilitating).  But it is *NOT* what you want for high-quality static displays such as image rendering on a desktop PC.  Do not be fooled by bogus "specsmanship" which trots out dissimilarly measured "dot pitch" (which is really a misnomer in the context of aperture-grille CRTs, which do not have "dots", per se) specs to show aperture-grille CRTs as competitive with conventional shadow-mask types.  The numbers *cannot* be directly compared, precisely because they are measuring different things.  As a ROUGH approximation ONLY, a 0.25mm AG pitch is "sort of" equivalent to a nominal 0.30-0.32mm dot pitch in the horizontal direction ONLY; it is much worse than that vertically (at least as these units are typically adjusted and used).


Note, all of this is with respect to "conventional" shadow mask CRTs. But the reality is that over the past five years or so, a much-improved "second generation" of shadow-mask CRTs (first developed by Hitachi), which offer superior geometry to "conventional" CRTs, has become nearly ubiquitous. Meanwhile, aperture grille development has remained more-or-less static. So the comparison is really even more striking than I allude to above. The following page explains some of this in more detail, if you're interested: http://www.hitachidigitalmedia.com/DMG/tech_monitors.jsp.

 

"Probably it's best to have a 2-monitor setup with both LCD and CRT."

 

Presuming you mean to display the image being edited on the CRT, and the rest of the "virtual desktop" on the LCD, this is not a bad approach, if only because it tends to save desktop real estate as compared to using two CRTs; but it's not really necessary.

 

Lex (perpendicularity consultant) Jenkins, May 27, 2005; 05:05 a.m.: "The Sony Trinitron is an aperture grille monitor. This type generally has a crisper, contrastier and more saturated image than shadow mask monitors."

 

Which is not to say "superior".  Over-saturation is in fact one of the chronic problems of AG-CRT based monitors; but this tends to have at least as much to do with marketing as it does to the underlying technology.

 

"I haven't bought an LCD display because every time I've tried to get used to them in the stores (for several years now), I just find them annoying. They look unnatural to me. There's no other choice with notebooks but for now I can't see replacing a desktop CRT with an LCD."

 

At least most of the time, retail stores are THE WORST possible place to judge a monitor's quality.  Not only is it usually a very "monitor unfriendly" environment -- way too much ambient light, usually produced by scads of fluorescent tubes (or worse, mercury-vapor discharge bulbs; i.e., _street_lamps_, fer crissakes!) in the ceilings; and usually an absolute cesspit of both RFI and EMI -- the monitors themselves are nearly always *grossly* misadjusted by the idiot sales staff in an effort to produce "punchier" displays, to the severe detriment of realistic *accurate* rendering.  The big mass-marketers (CompUSA, et al) also very often commit the cardinal sin of driving a bank of 10-20 monitors off the same PC via a signal-splitter / distribution amplifier, which nearly always *thoroughly* screws up the signals being fed to ALL of the monitors.  Bottom Line:  Attempting to make image-quality judgements in such an environment is self-defeating, at best.

 

 

Frank Oddsocks, May 27, 2005; 05:19 a.m.: "I don't believe CRT is significantly better, if at all, but even if it were you should still be using LCD for safety reasons:"

 

"Safety reasons"?!?  Please.  I've heard some crazy things put forth to justify an arbitrary opinion; but this is surely one of the most bizarre ones ever.

 

"1. Many people get headaches from 60Hz CRT's and some (including me) can see the flicker. This is the frequency the much-vaunted Trinitrons-on-Macs ran at in the bad old days. Now you can expect 75Hz, which is still too low for some. If you're buying the monitor for an employee, you'd want 85Hz minimum. Good luck finding this at high res."

 

Refresh rate has got to be the most over-hyped and misunderstood monitor "spec" to ever have been given far more importance by the general public than it could possibly be worth.  Notwithstanding the occasional semi-epileptic freak of nature, it is HUMANLY IMPOSSIBLE to detect "flicker" at refresh rates even HALF that high.  There are two very good reasons for this...


First, there is the matter of the persistence of the CRT's phosphors.  Once excited by the electron beam, the phosphor will continue to glow for some significant period of time after that excitation is removed.  This is the same principle as what makes the numerals on your old-fashioned "glow in the dark" wristwatch face light up.  Phosphor persistences vary from "very short" to "very long" -- and the latter (such as are sometimes used in oscilloscopes and such) can continue to glow brightly for _several_seconds_ after the beam has been completely removed.  Ergo, it is absolutely meaningless to discuss "flicker" as purely a function of the refresh rate, without also determining the persistence of the phosphor used on a particular CRT.  Having said that, I'll also note that, IN GENERAL, even the "medium-short" persistence phosphors commonly used on current computer monitor CRTs are MORE than adequate to cover refresh rates FAR lower than we are discussing here.


Second, even ignoring the persistence issue, even the LOWEST refresh rates used by modern monitors are well above the threshold of human perception.  Don't believe me?  OK...  Go to the movies some time.  Do you see "flicker"?  The "refresh rate" (so to speak) in a commercial theater is 24 frames per second -- and this is using pure light (which, for all intents and purposes, has NO "persistence").  It follows then that if the difference between, say, a 60Hz refresh and a 75-85Hz refresh were significant, you would find the movie presentation downright unwatchable.
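To put plain numbers on the refresh rates being argued about here, the frame period is just the reciprocal of the rate:

```python
# Frame period (ms) is the reciprocal of the refresh rate (Hz).
def frame_period_ms(refresh_hz: float) -> float:
    return 1000.0 / refresh_hz

# The rates discussed in this thread: cinema, "bad old days", and modern CRTs.
for hz in (24, 60, 75, 85):
    print(f"{hz} Hz -> {frame_period_ms(hz):.1f} ms per frame")
```

So the gap between 60Hz and 85Hz amounts to roughly 5 ms per frame, which is the scale of difference the flicker debate turns on.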


Beyond all of this, if you still think "flicker" is a significant issue, you will be running away from *all* desktop LCD monitors as fast as you can, because they ALL produce MUCH more "flicker" than any CRT-based monitor, because the light source is actually an AC-driven fluorescent lamp -- which, like the theater projector's mechanical shutter, has *NO* "persistence" to cover up the flashing!

 

"2. CRTs charge the air in front of them. (The degaussing coil works only at intervals - you can hear it go.) It has been shown that bacteria, dust, skin particles, cat dander etc. get charged and float across to your face. This causes skin infections."

 

Man, you are *really* reaching.


To the extent that such "charging" actually occurs, the net effect is to attract the various nearby stray atmospheric guck to the surface of the screen -- where it STAYS until you forcibly remove it via a solvent and a cleaning rag.  This is why your monitor (and your TV screen) seems to get dirtier, faster, than most everything around it.  Hence, the net "health" impact -- while still surely insignificant, at least to most folks -- is precisely the opposite of what you claim.  After all, whatever gets stuck to the screen *was* floating around in the atmosphere in the immediate vicinity of your workplace, until the monitor "grabbed" it out of the air and thus put it out of reach of your lungs, skin, etc.

 

 

"3. They were a leading cause of residential house fires in Australia in the 1990s, just behind gas "space heaters" which have a large installed base in this country."

 

Was a large batch of particularly defective monitors shipped "down under" at around that time?  That's about the only basis I can think of for your assertion to be meaningful.

 

"It's human nature to ignore the risks of something ubiquitous."

 

It's also human nature to pull things out of context and (invalidly) attempt to imply causal relationships based on nothing more than statistical correlations or even simple coincidences. CRTs (in both computer monitors and TV sets) are in fact *so* ubiquitous that it would take a positively HUGE number of such incidents (probably in the tens of millions) to really be statistically significant. Had that actually happened -- ANYWHERE -- I think we'd have heard about it, ad nauseam, through the major media.


Whew!  Sorry, folks, for the length of this screed; but like I said, there was a LOT of misinformation to correct.


"...because it makes misconvergence -- long the main bugaboo of consumer TV sets, but far less of an issue now than it was even 5-10 years ago..."

 

Well, when talking about misinformation, let's look at this statement as a prime example. The Sony monitors were (are) known for a patented design called the Trinitron gun. This is a single RGB gun that does not go out of alignment. In fact, it's not possible to align the RGB guns in a Sony monitor. The Trinitron gun also provides better spot focus with less distortion at the edges or corners of the screen.

 

But this has nothing to do with the aperture grille shadow mask; it's a gun design. The aperture grille mask allows more light to reach the screen. The aperture grille is used to decelerate the electrons, and the design allows more electrons to reach the phosphors, making a much brighter picture. That was the main advantage of the aperture grille design for home television receivers: a brighter picture.


steve swinehart, May 31, 2005; 02:22 p.m.: "'...because it makes misconvergence -- long the main bugaboo of consumer TV sets, but far less of an issue now than it was even 5-10 years ago...' Well, when talking about misinformation, let's look at this statement as a prime example. The Sony monitors were (are) known for a patented design called the Trinitron gun. This is a single RGB gun that does not go out of alignment. In fact, it's not possible to align the RGB guns in a Sony monitor."

 

Oh, really? Then why does Sony include "convergence" controls on its Trinitron-based monitors (cf. http://www.sony.com.hk/Electronics/cp/gdm-f520.htm), hmmmm?


Try again.

 

"The Trinitron gun also provides better spot focus with less distortion at the edges or corners of the screen."

 

Maybe.  Maybe not.  Depends on the *specific* monitor(s) under discussion.

 

"But, this has nothing to do with the aperture grille shadow mask - it's a gun design."

 

Correct; but it is also an integral part of the "Trinitron" design class, so it remains a valid point of discussion when comparing aperture-grill CRTs to shadow-mask CRTs.

 

"The aperture grille mask allows more light to reach the screen."

 

Uhhh...  No.


*NO* light "reaches the screen" (from the inside, anyway) in either aperture-grill CRTs or shadow-mask CRTs.  You're obviously thinking of the electron beam; but that's not light -- and "more" of it is *not* what you want, if high resolution and precise rendering are your goals.  This in fact is one of the fundamental problems of the aperture-grill design:  Since there are no mask-induced "borders" to the pixels in the vertical dimension, if the electron beam has enough energy to strongly excite (i.e., brightly illuminate) the phosphor where you do want it, it inescapably also has enough energy to "bleed over" into adjacent areas where you *don't* want it.  Hence my earlier comments regarding AG-CRTs' abysmal vertical resolution performance.

 

"The aperture grille is used to decelerate the electrons,"

 

Uhhh...  No.  Again.


The aperture grille performs exactly the same function as a shadow mask, just not as well.  That function is to prevent the electron beam from exciting areas of the phosphor coating surrounding the actual target.  The grille or mask is grounded; hence any electrons which strike it are "short-circuited" (so to speak) before they have a chance of getting to the phosphor.

 

"and the design allows more electrons to reach the phosphors making a much brighter picture."

 

Brighter, yes -- but also less sharp, for precisely the same reason.

 

"That was the main advantage of the aperture grille design for home television receivers - a brighter picture."

 

That *may* be what initially sold it to the Great Unwashed Masses; but it is NOT what actually made it "better" for that application.

