Is a ViewSonic a low-end monitor? Mine is about 5 years old, gets two to
three hours' (at least) use each day, and is running fine. I have no
intention of buying an LCD display until this monitor fails (or I buy my
next computer).
I don't know much about ViewSonic; it's hardly available here, if at all. But
by low-end I meant the cheapest models. I have a Philips 105S here, which is
one of those. Image quality has degraded significantly over the three years
I've used it.
This Eizo monitor has been running about 12 hours per day or more (I work at
home) for the last few years. The hours of use I mentioned can be found in the
OSD menu.
There was a time when Eizo was pretty much _the_ monitor to own. It is no
longer a highly visible brand, at least not in the US.
I would agree. Their TFT screens are no better than others I've seen. As for
CRT, it's a long story, but I tried several T766 models and this one was the
only one without grave convergence errors. Philips' high-end CRT range also
suffered from major convergence problems (I had tried six different 109p40s
before giving up and going for this Eizo...).
But, now that I have this specific one, I'm glad I have it. It still has
the "Eizo Legacy" of quality... Not just in image quality, but also in
features. I've never seen another monitor that lets you set the color
cut-off in the OSD menu, for example. There may be others, but I've never
seen one.
I've wondered why we haven't seen much in the plasma-based computer
displays. I can think of a number of reasons, including limited resolution
for a given screen size and significantly higher power consumption.
Don't plasma screens wear out quite rapidly? Computer screens mostly see more
use than TVs, so perhaps that's the reason? Or, more likely, the manufacturers
want to exploit the investments made in TFT as much as possible; it's better
for them to delay introducing a new technology.
Way incorrect.
All current LCDs use additive color. Some years back there were LCD panels
for transparency projectors that used subtractive synthesis. This allowed
significantly higher resolution, but had no other advantage I can think of.
If you want to get picky about it, subtractive systems are "additive" in
that (for example) the yellow layer -- and only the yellow layer -- controls
the amount of blue light in the image. The blue light passed by the yellow
layer is "added" to the green light passed by the magenta layer and the red
light passed by the cyan layer. (This is all semantics, of course. I'm
making a point, not trying to "prove" something.)
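That "each layer controls one additive component" point can be sketched numerically. This is a toy model with made-up dye densities, not any real panel's behavior: each subtractive layer simply attenuates the one primary it absorbs.

```python
# Toy model of subtractive color layers over a white backlight.
# Yellow absorbs blue, magenta absorbs green, cyan absorbs red -- so
# varying only the yellow layer's density changes only the blue component.

def apply_filters(white, yellow, magenta, cyan):
    """Return (r, g, b) after white light passes three subtractive layers.

    Each density argument is in [0, 1]; 0 means the layer is clear.
    """
    r, g, b = white
    r *= (1.0 - cyan)     # cyan layer alone attenuates red
    g *= (1.0 - magenta)  # magenta layer alone attenuates green
    b *= (1.0 - yellow)   # yellow layer alone attenuates blue
    return (r, g, b)

# With an idealized white backlight, only the yellow density moves blue:
print(apply_filters((1, 1, 1), yellow=0.0, magenta=0.0, cyan=0.0))  # (1.0, 1.0, 1.0)
print(apply_filters((1, 1, 1), yellow=0.5, magenta=0.0, cyan=0.0))  # (1.0, 1.0, 0.5)
```

In that sense the "subtractive" stack behaves like three independent additive channels, which is the semantic point being made above.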
I have no idea what you mean by "color polarization". The polarized layers
in an LCD are _not_ used to create colors (such as the colors seen when
placing plastics between crossed polarizers).
As it says at [1]: "LCD technology is based on the properties of polarized
light (...) When an LCD pixel darkens, it polarizes at 90 degrees to the
polarizing screens." However, it could be that I've got things mixed up. I do
know that wearing polarized sunglasses can make an LCD display (like on a
watch) unreadable, but I've never tried it on a TFT LCD screen.
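For what it's worth, the quoted description matches Malus's law for ideal polarizers: the transmitted fraction of polarized light goes as cos² of the angle between the light's polarization and the filter's axis, so at 90 degrees (crossed polarizers) essentially nothing gets through. A minimal sketch, assuming idealized lossless filters rather than real LCD components:

```python
import math

def transmitted_fraction(angle_degrees):
    """Malus's law: fraction of polarized light passed by an ideal
    polarizer whose axis is rotated angle_degrees from the light's
    polarization plane."""
    return math.cos(math.radians(angle_degrees)) ** 2

print(transmitted_fraction(0))   # aligned filters: 1.0 (pixel bright)
print(transmitted_fraction(90))  # crossed filters: ~0 (pixel dark)
```

That 90-degree case is also why polarized sunglasses can black out a watch LCD at certain angles.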
As for the subtraction: the process of creating color in TFT screens starts
with white light, which is then filtered into the desired color. I call that
subtraction. It may not be literal subtraction from the point of view of the
subpixel, but I do think the approach is flawed in principle: you cannot
produce monochromatic light for the RGB primaries that way. The phosphor
photon emission of a CRT comes much closer, if not all the way.
Anyway, you seem to know more about TFT screens than I do, but that doesn't
change the fact that I don't like them. Image quality is just too poor, the
manufacturers cheat to get higher specs, and those cheats are visible.
[1]
http://en.wikipedia.org/wiki/Liquid_crystal_display_television