RJ_32
I've just read that a 25" tv (the set is 20 yrs old) uses only about 100
watts. Is that true? It seems low.
RJ_32 said: I've just read that a 25" tv (the set is 20 yrs old) uses only about 100
watts. Is that true? It seems low.
Don said: That sounds just a little low to me, but may be correct. My 17" CRT
monitor consumes about 90 watts. My maybe 3 year old 20 inch CRT TV
consumes 70 watts.
- Don Klipstein ([email protected])
T said: Meanwhile the power brickette for my laptop eats 200W.
How can that be so much? Is the rating what it's actually consuming, or is
that the maximum possible? If the CPU is the biggest power consumer, then that
usage varies a lot, right? Or does the screen use more than the CPU?
Don said: If it's the usual "LCD with backlight" screen, I'd expect that operating
the disk drive (spinning the platters, moving the heads, plus more power
required to write) sucks more power than the screen, and quite possibly
more than the CPU.
How can that be so much? Is the rating what it's actually consuming, or is
that the maximum possible? If the CPU is the biggest power consumer, then that
usage varies a lot, right? Or does the screen use more than the CPU?
(My Dell laptop's PS says 90 watts.)
Keep reading. There are two sets of numbers: what the supply provides
to the laptop (90W), and what the supply itself draws from the wall
at 125VAC.
In my case, my 65W Dell power supply uses 187.5W of AC power.
Not unless it's brick-sized or runs too hot to touch does it manage
that. Possibly you are confusing VA with watts - they are not the same.
More than "too hot to touch". If it delivered 65 watts while drawing 187.5
watts, it would have to dissipate the other 122.5 watts as heat - it'd melt or explode.
Here are the numbers right from the supply:
AC: 100-240V 1.5A
DC: 19.5V 3.34A
So either there's a different formula for AC wattage or what's printed
on the supply is wrong.
I've always known Power = Voltage * Current
So 125V * 1.5A = 187.5W
19.5V * 3.34A = 65.13W
W = V * A
But only if the voltage and current are in phase, which is not
necessarily the case with AC. Hence the distinction between
VA - which is what you get when you simply multiply the volts
times the amps in AC - and watts, in which the calculation corrects
for the possible phase difference. To get watts, which is the
"real" part of volt-amps, the calculation is:
W = V * A * cos(φ)
where φ is the phase angle between voltage and current.
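To make the VA-vs-watts distinction concrete, here's a quick sketch in Python. The 60° phase angle is made up purely for illustration - the brick's actual power factor isn't given anywhere in this thread:

```python
import math

voltage = 125.0   # RMS line voltage (V), as used earlier in the thread
current = 1.5     # nameplate RMS current (A) from the supply label
phase_deg = 60.0  # hypothetical phase angle; the real value is unknown

# Apparent power is the simple V * A product.
apparent_power = voltage * current  # volt-amps (VA)

# Real power corrects for the phase difference between voltage and current.
real_power = apparent_power * math.cos(math.radians(phase_deg))  # watts (W)

print(f"Apparent power: {apparent_power:.1f} VA")  # 187.5 VA
print(f"Real power:     {real_power:.2f} W")       # 93.75 W
```

So the same 187.5 VA figure can correspond to much less real power, depending on the power factor cos(φ).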
187.5 VA (not W). The current listed is likely the peak current,
which is the rating needed for wires, switches, transformers, etc.
The 65W DC output would be the maximum power the PC could consume.
Divide by .7-.8 or so (the brick's conversion efficiency) and you get
the maximum power drawn by the brick (from the wall).
Unless the current is a sine wave, it's not that simple.
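The divide-by-efficiency step above can be sketched the same way; the 70-80% range is the thread's own rough assumption, not a measured figure for this particular brick:

```python
# DC output from the numbers printed on the supply: 19.5 V at 3.34 A.
dc_output = 19.5 * 3.34  # ~65.1 W delivered to the laptop

# Assumed conversion efficiencies; real bricks vary.
for efficiency in (0.7, 0.8):
    wall_draw = dc_output / efficiency
    print(f"Efficiency {efficiency:.0%}: about {wall_draw:.0f} W from the wall")
```

So at worst the brick draws roughly 90-odd watts from the wall - nowhere near 187.5 W.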