Maker Pro

Monitor's settings too high?

js5895

Hi,

I have a 5-year-old Gateway VX900 19" CRT monitor, and I was told
that you should run your monitor at the maximum refresh rate for a
given resolution. Will this damage my monitor or make it age faster?
What about maximum brightness and contrast? Someone else told me
that "running it at a lower or higher refresh rate will hurt it",
that rates that are too low produce a high-frequency squeal, and
that maximum brightness and contrast "just burns out the phosphorus
faster." I believe the part about the phosphors, but I'm not so sure
about the frequency. I currently have it at 1024x768, 81 kHz/100 Hz,
with maximum brightness and contrast, whereas I usually run it at
1024x768, 38 kHz/60 Hz, 100% contrast, and 50-75% brightness. Either
way I'm within the manufacturer's specifications of 31-95 kHz/50-160
Hz. The higher rate looks better and is faster/smoother, but a
little blurry; then again, it's been getting a little blurry at some
screen resolutions and frequencies lately.

Thanks for all your help.
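
(A quick way to sanity-check figures like these: the horizontal scan
rate is roughly the vertical refresh rate times the total number of
scan lines per frame, visible lines plus vertical blanking. Below is
a minimal Python sketch using the 31-95 kHz / 50-160 Hz limits
quoted above; the ~6% blanking overhead is my assumption, not a
VX900 spec. For what it's worth, 38 kHz at 60 Hz implies only about
630 total lines, which is closer to 800x600 timing than to
1024x768.)

# Rough CRT timing check: horizontal scan rate ~= vertical refresh
# multiplied by total scan lines (visible + vertical blanking).
H_MIN_KHZ, H_MAX_KHZ = 31.0, 95.0   # advertised horizontal range
V_MIN_HZ, V_MAX_HZ = 50.0, 160.0    # advertised vertical range
BLANKING = 1.06                     # assumed ~6% blanking overhead

def hscan_khz(visible_lines, refresh_hz):
    """Estimate the horizontal scan rate in kHz for a given mode."""
    return visible_lines * BLANKING * refresh_hz / 1000.0

def mode_ok(visible_lines, refresh_hz):
    """True if the mode appears to fall inside both ranges."""
    h = hscan_khz(visible_lines, refresh_hz)
    return H_MIN_KHZ <= h <= H_MAX_KHZ and V_MIN_HZ <= refresh_hz <= V_MAX_HZ

for hz in (60, 100, 160):
    h = hscan_khz(768, hz)
    print(f"1024x768 @ {hz} Hz -> ~{h:.0f} kHz horizontal,",
          "OK" if mode_ok(768, hz) else "OUT OF RANGE")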
 
If you already have five years of use on the monitor, it is likely
approaching the end of its useful life; in any case, the refresh
rate doesn't normally affect the lifetime of a monitor.

The critical thing is to stay within the scan-rate limits of the
monitor; otherwise you could theoretically damage its deflection
circuitry. This is pretty difficult to do unless you are running
Unix, Linux, or one of their clones and set the scan parameters to
some ridiculous extreme.
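
To make that concrete: on X11 systems those scan parameters come
from a modeline, and the rates the monitor is asked to deliver
follow from simple division (horizontal rate = pixel clock /
horizontal total; vertical rate = horizontal rate / vertical total).
A minimal Python sketch; the modeline figures below are invented for
illustration, not tested values for any real monitor.

# Scan rates implied by an X11-style modeline. The example totals
# correspond to a hypothetical "1024x768 @ 100 Hz" mode:
#   Modeline "1024x768_100" 112.01 1024 1088 1192 1376 768 769 772 814
def modeline_rates(pclk_mhz, htotal, vtotal):
    """Return (horizontal kHz, vertical Hz) implied by the totals."""
    hscan_khz = pclk_mhz * 1000.0 / htotal
    return hscan_khz, hscan_khz * 1000.0 / vtotal

print(modeline_rates(112.01, 1376, 814))   # ~(81.4, 100.0) -- fine
# Careless numbers ask the monitor for rates far outside 31-95 kHz,
# which is exactly the failure mode described above:
print(modeline_rates(250.0, 1376, 814))    # ~(181.7, 223.2) -- danger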

If the monitor's only symptom is blurriness, that points to poor
high-voltage regulation or a failing CRT, in which case the cost of
repair would likely exceed the cost of a new monitor. Your
probability of running into this problem depends heavily on the
brand of monitor you use. Over the years I've owned two generic
monitors that lasted only two years each; then I moved to Sony
multi-sync monitors, which are good for about 5 years, and for the
past 8 years I've been using a 19" Mitsubishi Diamond Pro 900u
(which originally cost around $900 and is still going strong and
sharp). When it fails to deliver performance (which it eventually
will), my next monitor will be one of those flat-panel screens.

The point here is that unless you drastically exceed the scan-rate
parameters of the monitor, its useful life will not be noticeably
affected. By contrast, when you purchase an inexpensive monitor (say
a Sanyo or something akin), expect a short useful lifetime for the
device.

Hope this helps. Harry C.
 
js5895

Me, I like Gateway, and this is a good-quality $600 monitor. It's
also the only 19" monitor I've noticed with a horizontal dot pitch
as low as 0.22 mm, and the picture is beautiful.
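
(As a rough check on what that pitch buys: assuming an ~18" viewable
diagonal, typical of a 19" tube but not a quoted VX900 figure, and a
4:3 aspect ratio, a 0.22 mm horizontal pitch leaves even a
1600-pixel-wide mode unconstrained by the mask. A quick Python
sketch:)

MM_PER_INCH = 25.4
width_mm = 18.0 * MM_PER_INCH * 0.8   # 4:3 -> width is 0.8 * diagonal
stripes = width_mm / 0.22             # phosphor stripes across screen
print(f"viewable width ~{width_mm:.0f} mm, ~{stripes:.0f} stripes")
# ~366 mm and ~1662 stripes: more than 1600x1200 needs horizontally.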
 
Mark Haase

js5895 said:
Hi,

I have a 5-year-old Gateway VX900 19" CRT monitor, and I was told
that you should run your monitor at the maximum refresh rate for a
given resolution. Will this damage my monitor or make it age faster?
What about maximum brightness and contrast? Someone else told me
that "running it at a lower or higher refresh rate will hurt it",
that rates that are too low produce a high-frequency squeal, and
that maximum brightness and contrast "just burns out the phosphorus
faster." I believe the part about the phosphors, but I'm not so sure
about the frequency. I currently have it at 1024x768, 81 kHz/100 Hz,
with maximum brightness and contrast, whereas I usually run it at
1024x768, 38 kHz/60 Hz, 100% contrast, and 50-75% brightness. Either
way I'm within the manufacturer's specifications of 31-95 kHz/50-160
Hz. The higher rate looks better and is faster/smoother, but a
little blurry; then again, it's been getting a little blurry at some
screen resolutions and frequencies lately.

Thanks for all your help.

I don't think scan rate will wear out the components any faster,
and I have a hard time believing that phosphorus "burns out" at all.
I'm using a 17" Sony Trinitron that my mom bought at least 8 years
ago, if not more. I've been using it ever since I left for college
4 years ago, which means it's been running at max for at least that
long.

I have to admit that I generally don't turn up the brightness or
contrast too much, because I like the black on my screen to be true
black. Maybe that's why it's lasted so long. But Sony also simply
makes great stuff.
 
Fred Abse

I have a hard time believing that phosphorus "burns out" at all

Go take a look at a screen that's been displaying the same thing
24/7 for a few years, such as an industrial machine or medical
equipment. The image is "burned" into the screen as a darker image,
visible when the monitor is off.

Vertical scan collapse on a TV or computer monitor screen will burn
it in minutes. A stationary spot will do it in seconds. The old
Schmidt-system back-projection TVs had circuitry to kill the EHT and
bias the CRT hard off in the event of a scan failure, since a
stationary spot could melt the glass.
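
A back-of-the-envelope illustration of why the stationary spot is so
destructive. The anode voltage, beam current, and spot size below
are typical values I'm assuming, not measurements from any
particular tube; the point is the concentration factor.

# Beam power is modest, but scanning spreads it over the faceplate;
# scan failure concentrates it on a fraction of a square millimetre.
beam_w = 25.0 * 0.5                 # 25 kV * 0.5 mA = 12.5 W
raster_mm2 = 365.0 * 274.0          # ~19" 4:3 raster area in mm^2
spot_mm2 = 0.5                      # stationary spot area in mm^2
print(f"scanned:    {beam_w / raster_mm2 * 100:.4f} W/cm^2")
print(f"stationary: {beam_w / spot_mm2 * 100:.0f} W/cm^2")
# Roughly a 200,000x increase in power density on one tiny spot.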

BTW, it's "phosphor", not "phosphorus", so called because it is
phosphorescent. The element phosphorus does not need to be present.
The original CRT phosphor, back around the turn of the 20th century,
was zinc sulfide.
 
Zak

Fred said:
Vertical scan collapse on a TV or computer monitor screen will burn
it in minutes. A stationary spot will do it in seconds. The old
Schmidt-system back-projection TVs had circuitry to kill the EHT and
bias the CRT hard off in the event of a scan failure, since a
stationary spot could melt the glass.

Monitors have this as well.

I think that aging CRTs mostly fail from cathode exhaustion. The
emission spot increases in size, and the image gets fuzzy as a
result.

Thomas
 
Tom MacIntyre

Go take a look at a screen that's been displaying the same thing
24/7 for a few years, such as an industrial machine or medical
equipment. The image is "burned" into the screen as a darker image,
visible when the monitor is off.

As I recall, though, monitors have a faster (I forget the exact
term) loss of brightness per line, so they don't have such a
tendency to burn in. TVs definitely will, and have had, a tendency
to do so.

Tom
 
Fred Abse

Monitors have this as well.

Some do, some don't.
I think that aging CRTs mostly fail from cathode exhaustion. The
emission spot increases in size, and the image gets fuzzy as a
result.

Secondary emission due to electrode contamination can cause similar
effects.
 
Fred Abse

As I recall, though, monitors have a faster (I forget the exact
term) loss of brightness per line, so they don't have such a
tendency to burn in.

Depends on application. Industrial and medical monitors can have
quite slow (15-25 kHz) scan rates.

Even at high rates, burn-in will be noticeable after a couple of years
running 24/7 on a stationary image, especially if it's text.
 
Michael Redmann

Fred said:
Depends on application. Industrial and medical monitors can have
quite slow (15-25 kHz) scan rates.

Even at high rates, burn-in will be noticeable after a couple of years
running 24/7 on a stationary image, especially if it's text.

I'm not sure burn-in is dependent on refresh rate. I think that at
any refresh rate the screen phosphor is "illuminated" the same
amount of time (more or less) by the electron beam. Perhaps the
different types of phosphor (monitor vs. TV) have different decay
rates, and that's the reason TVs tend to burn in.

Regards
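
A quick sketch of that argument in numbers, ignoring blanking
intervals: each phosphor dot is swept once per frame, and the dwell
per sweep shrinks as the frame rate rises, so the illuminated time
per second (and hence the energy dose at a given beam current) stays
essentially constant.

pixels = 1024 * 768
for refresh_hz in (60, 100, 160):
    dwell = 1.0 / (pixels * refresh_hz)       # seconds per sweep
    lit_per_second = dwell * refresh_hz       # sweeps/s * dwell
    print(f"{refresh_hz:>3} Hz: lit {lit_per_second * 1e6:.3f} us/s")
# All three print ~1.272 us/s: burn-in tracks brightness, not rate.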
 
Fred Abse

I'm not sure burn-in is dependent on refresh rate. I think that at
any refresh rate the screen phosphor is "illuminated" the same
amount of time (more or less) by the electron beam. Perhaps the
different types of phosphor (monitor vs. TV) have different decay
rates, and that's the reason TVs tend to burn in.

You may well be right. I've not noticed industrial monitors running
at 15 kHz x 60 Hz burn any worse than those running at 31.5 kHz x
70-something Hz. They both burn noticeably in a couple of years.
It's down to watts per square meter, I guess. Same light output,
same power density, same degradation.

Are TV phosphors different from monitor phosphors? TVs don't appear to
burn as much, but they generally don't run 24/7 on stationary patterns,
and the raster fully fills the screen area, whereas monitors are usually
underscanned slightly. It may just be less noticeable.
 