# Watts equation

Discussion in 'General Electronics Discussion' started by dbyrd26, Jul 25, 2014.

1. ### dbyrd26

I've been told that in order to figure out the watts a device uses, you multiply the input voltage by the current. Or, to figure out how much current you're pulling, you divide the watts by the voltage. I'm from America, where 110V is the typical mains voltage. From my understanding, 220V is typical in Europe? Anyway, if you have a 100 watt light bulb, does the amount of current you're using change if the input voltage is 220 instead of 110?

2. ### BobK

Yes, it changes very rapidly, from too much to none. In other words, the light bulb blows out.

A bulb designed to work on 220V will use half as much current to make 100W as one designed to run on 110V.

There are devices (for example, most laptop power supplies) that will run on either voltage. For those devices, yes, the current will be less when running from 220V than when running from 110V.

Bob

3. ### KrisBlueNZ (sadly passed away in 2015)

That's right. I would say you multiply the voltage ACROSS the device by the current flowing THROUGH the device.

That's true for DC. Things can get tricky with AC when capacitance and/or inductance are involved, but that's not relevant for incandescent light bulbs.
Yes, and also in Britain, Australia, New Zealand, South Africa (I think), and some other places.

If you're asking "does a 100W American light bulb draw a different amount of current from a 100W European light bulb", the answer is yes. A bulb that's designed to consume 100W at 110V will draw 0.91 amps (from the formula I = P / V, where I is current in amps, P is power in watts, and V is voltage in volts) at 110V. A bulb that's designed to consume 100W at 220V will draw 0.45 amps (by the same formula) at 220V. In both cases the power consumption is 100W, but since the voltages are different, the currents are different too. The higher voltage times the lower current equals the lower voltage times the higher current: 220V × 0.45A ≈ 110V × 0.91A ≈ 100W.
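
Here's a quick sketch of that calculation in Python, just for illustration (the variable names are my own):

```python
# Current drawn by a 100 W bulb at its rated voltage: I = P / V
P = 100.0                   # rated power in watts
for V in (110.0, 220.0):    # American and European mains voltages
    I = P / V
    print(f"At {V:.0f} V, a {P:.0f} W bulb draws {I:.2f} A")
# At 110 V, a 100 W bulb draws 0.91 A
# At 220 V, a 100 W bulb draws 0.45 A
```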

If you're asking "what happens if I take an American light bulb and connect it to a European power outlet", then you need to understand resistance. Resistance is the characteristic that relates voltage to current, for a given device. The unit is ohms, "Ω". You can measure the resistance of an incandescent light bulb with a multimeter. The resistance of a light bulb actually changes somewhat, depending on how hot the filament is, but for the sake of this explanation, I will pretend that it doesn't.

Resistance is related to voltage and current through Ohm's Law, probably the most important and basic formula in electronics. There are three arrangements of Ohm's Law:
• I = V / R
• R = V / I
• V = I × R
where
I is current, in amps (A), flowing through the device;
V is voltage, in volts (V), measured across the device;
R is resistance, in ohms (Ω), of the device.
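
If it helps, here are the three arrangements written out as Python functions (a minimal sketch; the function names are my own):

```python
def current(v, r):
    """I = V / R: current in amps through a device of resistance r with voltage v across it."""
    return v / r

def resistance(v, i):
    """R = V / I: resistance in ohms, given the voltage across and current through the device."""
    return v / i

def voltage(i, r):
    """V = I * R: voltage in volts across a resistance r carrying current i."""
    return i * r
```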

Your American 100W light bulb draws 0.91A at 110V. Using Ohm's Law arranged as R = V / I we can calculate its resistance as about 121Ω.

A European 100W light bulb draws 0.45A at 220V, so its resistance is about 484Ω.
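
The same two numbers in code (note that since I = P / V, you can also write R = V / I as R = V² / P, which avoids the rounding in the currents above):

```python
P = 100.0                   # rated power, watts
r_american = 110.0**2 / P   # R = V**2 / P = 121 ohms
r_european = 220.0**2 / P   # 484 ohms
```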

If you connect your American light bulb, with its resistance of 121Ω, to a 220V supply, how much current will flow? Using Ohm's Law arranged as I = V / R we can calculate the current as 1.8 amps. That's twice what it should be!

Now we can calculate the power that will be dissipated by the bulb, using the power law, P = V × I. That gives us P = 220 × 1.8 which is 400 watts!
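
Putting that whole scenario together (again just a sketch, using the simplified constant resistance from above):

```python
R = 121.0     # ohms: the American 100 W bulb, from R = V / I above
V = 220.0     # volts: European mains

I = V / R     # Ohm's Law: about 1.82 A, twice the rated 0.91 A
P = V * I     # power law: about 400 W, four times the rated 100 W
print(f"{I:.2f} A, {P:.0f} W")   # 1.82 A, 400 W
```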

So the answer to that question is that if you connect an American bulb to a European supply, it will dissipate four times as much power as it should, and it will glow very brightly for a very short time, then it will be forever dark.