Maker Pro

watts for a tv

RJ_32

Jan 1, 1970
I've just read that a 25" tv (the set is 20 yrs old) uses only about 100
watts. Is that true? It seems low.
 
Bob Myers

Jan 1, 1970
RJ_32 said:
I've just read that a 25" tv (the set is 20 yrs old) uses only about 100
watts. Is that true? It seems low.

Depends on the display technology, etc., but that's not
at all unreasonable.

Bob M.
 
Don Klipstein

Jan 1, 1970
I've just read that a 25" tv (the set is 20 yrs old) uses only about 100
watts. Is that true? It seems low.

That sounds just a little low to me, but may be correct. My 17" CRT
monitor consumes about 90 watts. My maybe 3 year old 20 inch CRT TV
consumes 70 watts.

- Don Klipstein ([email protected])
 
Eeyore

Jan 1, 1970
RJ_32 said:
I've just read that a 25" tv (the set is 20 yrs old) uses only about 100
watts. Is that true? It seems low.

Not implausible. Very energy efficient compared with some modern TVs.

Graham
 
Eeyore

Jan 1, 1970
Don said:
That sounds just a little low to me, but may be correct. My 17" CRT
monitor consumes about 90 watts. My maybe 3 year old 20 inch CRT TV
consumes 70 watts.

Well my Sony E530 Multiscan monitor consumes allegedly 130W active but is 21"
and does insane scan and pixel numbers. Plus makes LCDs look stupid.

Only running 1280x1024 @85Hz right now though.

Graham
 
T

Jan 1, 1970
That sounds just a little low to me, but may be correct. My 17" CRT
monitor consumes about 90 watts. My maybe 3 year old 20 inch CRT TV
consumes 70 watts.

- Don Klipstein ([email protected])

Meanwhile the power brickette for my laptop eats 200W.
 
RJ_32

Jan 1, 1970
T said:
Meanwhile the power brickette for my laptop eats 200W.

how can that be so much? Is the rating what it's actually consuming, or is
that the maximum possible? If the CPU is the biggest power consumer, then that
usage varies a lot, right? Or does the screen use more than the CPU?

(My Dell laptop's PS says 90 watts.)
 
Don Bruder

Jan 1, 1970
how can that be so much? Is the rating what it's actually consuming, or is
that the maximum possible?

*USUALLY*, the rating on a PS refers to how much total power (probably
at multiple voltages) it can supply before something's magic smoke leaks
out and ends all the fun.
If the CPU is the biggest power consumer, then that
usage varies a lot, right? Or does the screen use more than the CPU?

If it's the usual "LCD with backlight" screen, I'd expect that operating
the disk drive (spinning the platters, moving the heads, plus more power
required to write) sucks more power than the screen, and quite possibly
more than the CPU. Making stuff more substantial than electrons move is
pretty energy-intensive. After the disk drive and CPU, audio output is
another power-hog (once again back to the "making things move is
expensive" concept) and any sort of wireless networking is, almost
literally, throwing power out the window by the bucketful. (at least,
anytime the network is "talking", it will be)
 
Eeyore

Jan 1, 1970
T said:
Meanwhile the power brickette for my laptop eats 200W.

All the time it's operative? I bet my desktop doesn't exceed that.

Graham
 
Eeyore

Jan 1, 1970
Don said:
If it's the usual "LCD with backlight" screen, I'd expect that operating
the disk drive (spinning the platters, moving the heads, plus more power
required to write) sucks more power than the screen, and quite possibly
more than the CPU.

No way. Laptop HDs are power misers these days. Invariably the CPU is one of the
highest on the list, probably just after the display but possibly ahead of it in
those cases where fatheads HAVE to have the latest bleeding-edge gizmo to write
Word documents <spit>.

Graham
 
Nicholas Sherlock

Jan 1, 1970
Don said:
If it's the usual "LCD with backlight" screen, I'd expect that operating
the disk drive (spinning the platters, moving the heads, plus more power
required to write) sucks more power than the screen, and quite possibly
more than the CPU.

Keeping the platters spinning doesn't take much, they have really good
bearings. The weight of the read arm is probably minuscule. Coding
Horror has some measurements of the power consumption in a real laptop:

http://www.codinghorror.com/blog/archives/000562.html

Cheers,
Nicholas Sherlock
 
T

Jan 1, 1970
how can that be so much? Is the rating what it's actually consuming, or is
that the maximum possible? If the CPU is the biggest power consumer, then that
usage varies a lot, right? Or does the screen use more than the CPU?

(My Dell laptop's PS says 90 watts.)

Keep reading. There are two sets of numbers. There is what it provides
to the laptop (90W) and what the power supply itself consumes based on
125VAC.

In my case, my 65W Dell power supply uses 187.5W of AC power.
 
Jasen Betts

Jan 1, 1970
Keep reading. There are two sets of numbers. There is what it provides
to the laptop (90W) and what the power supply itself consumes based on
125VAC.

In my case, my 65W Dell power supply uses 187.5W of AC power.

Not unless it's brick-sized or runs too hot to touch does it manage
that. Possibly you are confusing VA with watts - they are not the same.
 
T

Jan 1, 1970
Not unless it's brick-sized or runs too hot to touch does it manage
that. Possibly you are confusing VA with watts - they are not the same.

W = V * A

I know the supply voltage coming in, it's 125V.
 
T

Jan 1, 1970
More than "too hot to touch". If it dissipates 122.5 watts, it'll melt or explode.

Here are the numbers right from the supply:

AC: 100-240V 1.5A
DC: 19.5V 3.34A

So either there's a different formula for AC wattage or what's printed
on the supply is wrong.

I've always known Power = Voltage * Current

So 125V * 1.5A = 187.5W
19.5V * 3.34A = 65.13W
 
krw

Jan 1, 1970
Here are the numbers right from the supply:

AC: 100-240V 1.5A
DC: 19.5V 3.34A
So either there's a different formula for AC wattage or what's printed
on the supply is wrong.

W = V*A is only true for purely resistive loads, which your brick
almost certainly isn't.
I've always known Power = Voltage * Current

So 125V * 1.5A = 187.5W

187.5 VA (not W). The current listed is likely the peak current,
which is the rating needed for wires, switches, transformers, etc.
19.5V * 3.34A = 65.13W

That would be the maximum power the PC could consume. Divide by
.7-.8 or so and you get the maximum power drawn by the brick (from
the wall).
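As a quick sketch of that arithmetic (the 70-80% efficiency range is an assumption, not a measured spec for any particular brick):

```python
# Rough wall-power estimate for a laptop brick, from its DC label
# rating and an assumed conversion efficiency of 70-80%.
dc_watts = 19.5 * 3.34  # 65.13 W max delivered to the laptop

for efficiency in (0.7, 0.8):
    wall_watts = dc_watts / efficiency
    print(f"at {efficiency:.0%} efficiency: {wall_watts:.1f} W from the wall")
```

That lands the wall draw around 81-93 W at full load, well under the 187.5 W naive figure.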
 
Bob Myers

Jan 1, 1970

But only if the voltage and current are in phase, which is not
necessarily the case with AC. Hence the distinction between
VA - which is what you get when you simply multiply the volts
times the amps in AC - and watts, in which the calculation corrects
for the possible phase difference. To get watts, which is the
"real" part of volt-amps, the calculation is:

W = V * A * cos(p)

Where "p" is the phase difference between the voltage and
current waveforms. Note that if there is no difference, cos(p)
will be 1, and you get the simpler W = V * A.
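A quick numeric illustration (the 30-degree phase angle here is just an invented example, not a figure for any real supply):

```python
import math

# Apparent vs. real power for a sinusoidal AC load.
# V and A are RMS values; the 30-degree phase lag is illustrative.
V = 125.0        # RMS volts
A = 1.5          # RMS amps
phase_deg = 30.0

apparent_va = V * A                                      # VA = V * A
real_watts = V * A * math.cos(math.radians(phase_deg))   # W = V * A * cos(p)

print(f"apparent power: {apparent_va:.1f} VA")   # 187.5 VA
print(f"real power:     {real_watts:.1f} W")     # 162.4 W
```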

I suspect, though, that what's really going on here is that you're
confusing a peak or inrush current rating with the actual (typical)
current drawn during normal use.

Bob M.
 
Jasen Betts

Jan 1, 1970
But only if the voltage and current are in phase, which is not
necessarily the case with AC. Hence the distinction between
VA - which is what you get when you simply multiply the volts
times the amps in AC - and watts, in which the calculation corrects
for the possible phase difference. To get watts, which is the
"real" part of volt-amps, the calculation is:

W = V * A * cos(p)

unless A is a sine wave it's not that simple.
 
Jasen Betts

Jan 1, 1970
it's the same formula, but as the AC voltage fluctuates (and changes
sign) the wattage of an AC device is the average wattage of the device
measured at each point over the entire AC cycle.

I don't think I explained that well...

AC watts = (1/(2*pi)) * ∫(theta: 0 -> 2*pi) V(theta) * A(theta) d theta

that's probably clear as mud too,

see this page:
http://en.wikipedia.org/wiki/Power_factor
especially this bit:
http://en.wikipedia.org/wiki/Power_factor#Non-linear_loads
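For a concrete (entirely made-up) waveform, that average can be computed numerically - here a sine voltage with a current clipped at half its peak, which drags the power factor below 1 even with zero phase shift:

```python
import math

# Numerically average v(t)*i(t) over one cycle to get real power,
# for a sine voltage and a deliberately non-sinusoidal (clipped)
# current. The 125 V / 1.5 A RMS figures and the clip level are
# illustrative assumptions only.
N = 100_000
Vpk = math.sqrt(2) * 125.0
Ipk = math.sqrt(2) * 1.5
clip = 0.5 * Ipk                      # current flat-tops at half peak

real_w = 0.0
i_rms_sq = 0.0
for k in range(N):
    theta = 2 * math.pi * k / N
    v = Vpk * math.sin(theta)
    i = max(-clip, min(clip, Ipk * math.sin(theta)))
    real_w += v * i / N               # running average of v*i
    i_rms_sq += i * i / N             # running average of i^2

apparent_va = 125.0 * math.sqrt(i_rms_sq)
print(f"real power:   {real_w:.1f} W")
print(f"apparent:     {apparent_va:.1f} VA")
print(f"power factor: {real_w / apparent_va:.3f}")
```

Even with voltage and current in phase, the distortion alone pulls the power factor a few percent below 1.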
187.5 VA (now W). The current listed is likely the peak current,
which is the rating needed for wires, switches, transformers, etc.

1.5A is probably full load RMS current, peak (and inrush) will be higher than that.
That would be the maximum power the PC could consume. Divide by
.7-.8 or so and you get the maximum power drawn by the brick (from
the wall).

yeah.
 
Bob Myers

Jan 1, 1970
unless A is a sine wave it's not that simple.

Of course not, but please keep in mind the OP had
no idea what "power factor" or "VA" were in the first
place. You don't take up all the complications right
at the start.

Bob M.
 