basic question about resistance and Ohm's law

Discussion in 'Electronic Design' started by [email protected], Jun 30, 2013.

1. Guest

Hi,
There is some basic stuff I don't understand about resistance. Legend holds it that resistance works by transforming power into heat. So why then, I ask, would adding a resistor to a circuit result in decreased power usage? In other words, let's say I have a light bulb in a 12 V circuit and the current is 1 amp, so the power usage is 12W and the resistance is 12 Ohms. If I now add a 12 Ohm resistor to the circuit it doubles the resistance and cuts the current, and thus the power usage, in half. But why? If the lamp now uses half the power of the original circuit, wouldn't the resistor also use half the power by turning it into heat, so that once again the power usage would be 12W? Confused. Thank you,

2. whit3rd (Guest)

Resistors, capacitors, and inductors are the three kinds of ideal linear components that
pass electrical current according to the voltage difference between their two attachment points (terminals).
The resistor is the only one that dissipates electrical power.
All REAL components in an electrical circuit have some combination of these
behaviors, i.e. they are not IDEAL.

The key point here is that 'adding a resistor in a circuit' can mean many different things.
The resistor in a battery/switch/lamp flashlight can be in series with the lamp, or in
parallel, or in series with the battery, or in parallel, or in series with the switch, or in
parallel.

If the switch is CLOSED and the resistance is in series with the lamp (which is itself
a not-very-ideal resistor), it can turn the flashlight OFF if the resistor's resistance is high
compared to the lamp's resistance. In that case, more resistance causes less power.

3. tm (Guest)

There is some basic stuff I don't understand about resistance. [snip] If I
now add a 12 Ohm resistor to the circuit it doubles the resistance and
cuts the current and thus the power usage in half. But why?
++++++++++++++++++++++++++++++++++++++++

Clue:

The bulb has 6 volts at 0.5 amps = 3 watts

4. Guest

Thank you for your replies. Maybe I didn't make myself clear enough:
I understand Ohm's law and I understand what resistance means in a circuit. The problem I have is understanding why people say resistance turns power into heat. If somebody told me that resistance only makes the pathway smaller for current (like a tap in a water pipe), and thus less power goes through, then I would completely understand everything about simple circuits and Ohm's law. But people say a resistor uses (dissipates) power. That's what I don't get. Thanks

5. tm (Guest)

Right. Energy can not be destroyed. It can only be converted into smoke.

6. rickman (Guest)

Your terminology is a bit confusing. You are describing the circuit in
words when this is a math concept. Yes, resistance turns current and
voltage (which together determine power) into heat. But to figure out how
much heat or power, you have to do the math. Others have replied with
some of the numbers.

You are describing the circuit in words but aren't analyzing it properly
because you are making assumptions and aren't doing the math.

When you add the 12 ohm resistor you cut the current in half and the
*total* power is cut in half. But the light bulb dissipates a quarter
of the original power (half the current times half the voltage) and the
resistor dissipates a quarter of the original power totaling half.
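The arithmetic in this post can be sketched in a few lines. This is a minimal sketch treating the bulb as an ideal 12 ohm resistor, which (as other posts note) real filament bulbs are not; the variable names are my own.

```python
# Series-circuit arithmetic: adding an equal series resistor halves the
# current and total power, but the bulb itself drops to a quarter.

V = 12.0         # supply voltage (volts)
R_bulb = 12.0    # bulb modeled as an ideal resistor (ohms)
R_series = 12.0  # added series resistor (ohms)

# Original circuit: bulb alone across the battery.
I_orig = V / R_bulb   # 1.0 A
P_orig = V * I_orig   # 12.0 W

# With the series resistor: total resistance doubles, current halves.
I_new = V / (R_bulb + R_series)  # 0.5 A
P_bulb = I_new**2 * R_bulb       # 3.0 W, a quarter of the original
P_res = I_new**2 * R_series      # 3.0 W dissipated in the resistor
P_total = P_bulb + P_res         # 6.0 W, half of the original

print(I_new, P_bulb, P_res, P_total)  # 0.5 3.0 3.0 6.0
```

Half the current times half the voltage gives a quarter of the power in each part, and the two quarters add up to the halved total.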

7. Phil Allison (Guest)


There is some basic stuff I don't understand about resistance. Legend holds
it that resistance works by transforming power into heat. So why then, I
ask, would it result in decreased power usage by adding a resistor in a
circuit? In other words, let's say I have a light bulb in a 12 V circuit and
the current is 1 amp and thus the power usage is 12W. Thus the resistance is
12 Ohms, If I now add a 12 Ohms resistor to the circuit it doubles the
resistance and cuts the current and thus the power usage in half. But why?

** Because the current drawn from the battery is cut in half.

0.5 amps at 12V = 6 watts.

** That is where you are mistaken.

The lamp now has BOTH half current AND half voltage since the resistor has
6V across it.

0.5 amps at 6V = 3 watts.

Resistors do two things, not one - they reduce current flow and reduce the
available voltage in a circuit.

.... Phil

8. rickman (Guest)

Yes, that is definitely the problem here. The OP is confused because he
knows the light bulb is non-linear.

9. Phil Allison (Guest)

"rickman"

** I really doubt the OP is aware of any such thing.

His mistake was to not do the math - as you already said.

.... Phil

10. rickman (Guest)

You need to learn to recognize sarcasm. Sometimes the smiley isn't given.

11. Phil Allison (Guest)

"rickman"
** OK , that was a tad subtle for this NG.

I often use " .... " at the end of such remarks in lieu of a smiley.

..... Phil

12. Guest

I guess my question is: why is Ohm's law correct yet so different from common sense? If I knew nothing about electronics, common sense would say that the power consumption of one circuit illuminating a light bulb, and a second circuit illuminating a light bulb (albeit dimmer) *and* heating up a resistor (of the same resistance as the bulb) in series, would be equal or even more.

My second question (not really related) is: are resistors evil things? It seems like nobody would want to waste energy by creating heat. Is there no way to "throttle" the flow of electrons without wasting power by turning it into heat?

Along those lines, if I run a 6V circuit with a 6 Ohm bulb (pretending bulbs are linear resistors) it would consume 6W. If I want to power the same bulb from a 12V battery I would have to add a 6 Ohm resistor to have the bulb illuminate at the same brightness. This circuit would consume 12W of power. That seems like an incredible waste of power. What am I missing here?
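The 6V-bulb-on-12V example works out exactly as described: a quick sketch, again pretending the bulb is a linear 6 ohm resistor, shows that the dropping resistor burns half the battery's output.

```python
# Running a 6 V / 6 ohm bulb from 12 V via a 6 ohm series resistor:
# same current and brightness as on 6 V, but half the power is wasted.

V_src = 12.0
R_bulb = 6.0
R_drop = 6.0   # series dropping resistor

I = V_src / (R_bulb + R_drop)   # 1.0 A, same as the bulb on 6 V alone
P_bulb = I**2 * R_bulb          # 6.0 W of useful light/heat in the bulb
P_drop = I**2 * R_drop          # 6.0 W wasted as heat in the resistor

efficiency = P_bulb / (P_bulb + P_drop)
print(P_bulb, P_drop, efficiency)  # 6.0 6.0 0.5
```

So the "incredible waste" is real: a series dropping resistor is only 50% efficient in this case, which is exactly why later posts suggest PWM or a DC-DC converter instead.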

13. Phil Allison (Guest)

I guess my question is: why is Ohm's law correct yet so different from
common sense?

** You need to improve your common sense.

If I knew nothing about electronics common sense would say that the power
consumption of one circuit illuminating a light bulb and a second circuit
illuminating a light bulb (albeit dimmer) *and* heating up a resistor (of
the same resistance as bulb) in series would be equal or even more.

** And if you considered the situation with battery drain, you might see
just how dumb that is.

My second question (not really related) is: so are resistors evil things?

** Many engineers consider them evil.

Along those lines, if I run a 6V circuit with a 6 Ohm bulb (pretending bulbs
are linear resistors) it would consume 6W. If I want to power the same bulb
by a 12V battery I would have to add a 6Ohm resistor to have the bulb
illuminate at the same brightness. This circuit would consume 12W of power.
That seems like an incredible waste of power. What am I missing here?

** The bleeding obvious.

Use 6V lamps only with 6V batteries.

Why do you think that lamps exist in all ratings from 1.5V up to 240V ?

.... Phil

14. Jasen Betts (Guest)

Assuming the lamp still has 12 ohms resistance when operated from 6
volts (It won't, lamps don't work like that, but that's immaterial to
this example and would only make the arithmetic harder)

No, the lamp now only uses one quarter of the original power: it sees 6V
and passes 0.5A for a total of 3W, and the resistor sees another 3W,

for 3W + 3W = 6W total power used.

15. Martin Brown (Guest)

You need to upgrade your common sense. Ohm's law is very basic and
fundamental to understanding electronics. The resistance or lack of it
determines the amount of current that can flow in a circuit. The voltage
dropped across a resistive component multiplied by the current flowing
through it determines how much power it will dissipate.

At constant applied voltage the current is halved by adding an equal
value series resistance to your bulb (actually it isn't because at a
lower current the cooler bulb filament offers less resistance). An
examiner's marking scheme once got this wrong in an exam; I turned up
with a page of algebra and a colleague turned up with a plank with the
actual circuit nailed to it! It was agreed to give marks for both.

An electric fire bar or fan heater is the most common application of
deliberately turning electricity into heat - an air source heat pump
would be more efficient, but turning electricity into heat is done.
Ultimately all high grade energy ends up being dissipated as heat as an
inevitable consequence of thermodynamics.

For many applications you can get away with switching the maximum
voltage on and off very quickly as a form of pulse width modulation to
avoid having any significant resistive losses. Class-D amplifiers use
this method to get high efficiency with less waste heat.

http://en.wikipedia.org/wiki/Class_D_Amplifier

These days if you wanted to do that you would probably use PWM or a DC
to DC converter to provide 6V from the 12V source at the right current
and high efficiency. Using a power rheostat (variable resistor) to
control current flowing in a motor or other load is much rarer these
days than it was in the past. See for example:

http://en.wikipedia.org/wiki/Pulse-width_modulation
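The PWM idea above can be sketched numerically. This is an idealized model of my own (the function name and values are illustrative, not from the thread): the switch is assumed to drop no voltage when on and pass no current when off, so it dissipates nothing itself, and the load's average power is set purely by the duty cycle.

```python
# Why PWM avoids resistive losses: an ideal on/off switch wastes no
# power itself, so average load power scales with the duty cycle
# instead of being thrown away in a series dropping resistor.

V = 12.0       # supply voltage (volts)
R_load = 6.0   # load resistance (ohms)

def avg_load_power(duty):
    """Average power into the load for an ideal switch at the given duty cycle.

    When on, the full V appears across the load (P = V^2 / R);
    when off, zero. The ideal switch dissipates nothing either way.
    """
    return duty * V**2 / R_load

# Full-on delivers 24 W; a 50% duty cycle delivers 12 W with (ideally)
# no heat in the controller, unlike the 50%-efficient resistor approach.
print(avg_load_power(1.0), avg_load_power(0.5))  # 24.0 12.0
```

Real switches have some on-resistance and switching losses, but these are small compared to burning the excess in a resistor, which is why switched-mode control dominates today.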

16. Uwe Hercksen (Guest)

Hello,

A resistor turns electrical power into heat, that is true. But that does
not mean more resistance will consume more power and produce more heat.
It depends on the circuit and the source used.
If you connect a resistor to a constant voltage source, more resistance
will consume less power. More resistance means less current, the voltage
stays the same, the product of current and voltage is power and gets
smaller.
But if you connect a resistor to a constant current source, more
resistance will consume more power. The current stays the same, the
voltage over the resistor rises and the power rises too.
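The contrast between the two source types can be sketched directly from the power formulas. A minimal illustration (the function names are my own):

```python
# Power vs. resistance depends on the source:
# constant voltage -> P = V^2 / R (more resistance, less power)
# constant current -> P = I^2 * R (more resistance, more power)

def power_const_voltage(V, R):
    """Power dissipated by R across an ideal constant-voltage source."""
    return V**2 / R

def power_const_current(I, R):
    """Power dissipated by R fed from an ideal constant-current source."""
    return I**2 * R

# Doubling R at a fixed 12 V halves the power...
print(power_const_voltage(12.0, 12.0), power_const_voltage(12.0, 24.0))  # 12.0 6.0
# ...but doubling R at a fixed 1 A doubles it.
print(power_const_current(1.0, 12.0), power_const_current(1.0, 24.0))    # 12.0 24.0
```

A battery-plus-bulb circuit is close to the constant-voltage case, which is why adding series resistance reduces the total power drawn.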

Bye

17. Charlie E. (Guest)

Ok, let me put my two cents in...

There are two concepts, Ohm's law about the relationship between
voltage, current and resistance, and power formulas, that describe how
much heat or whatever are being produced by a circuit. They are
related, but not the same.

When you have a simple circuit like you described with a battery,
light and resistor, you have to look at two things. First, Ohm's law
will tell you what voltage and current are across each part. Then,
you can take that voltage and current, and determine the power across
each part.

Now, when you increase resistance, you reduce the current. Reducing
the current causes the power to go lower. Remember, the formula for
power is V^2/R, so increasing R lowers the overall power. If you look
at a heater, you'll see that the resistance of the element is often
made small precisely so that it draws a lot of power...
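The heater point follows straight from P = V^2/R. A quick sketch (the 240 V mains figure and element values here are illustrative, not from the thread):

```python
# At a fixed supply voltage, a LOWER element resistance means MORE power.

V = 240.0  # illustrative mains-style voltage (volts)

for R in (120.0, 60.0, 30.0):       # element resistances in ohms
    print(R, V**2 / R, "W")         # 480 W, 960 W, 1920 W
```

Halving the element resistance doubles the heat output, which is why high-power heating elements are wound to a deliberately low resistance.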

Charlie

18. Fred Abse (Guest)

It isn't removed from common sense:

Push something harder, it'll move faster.

It's easier to swim in water than in molasses.

19. Guest

I think I understand now. Thank you. Too bad there is no electronic equivalent of a water pipe faucet, which just reduces the flow without wasting energy (water). A resistor wastes power as heat instead of just making it harder for current to flow and preserving the rest of the power in the battery. Thanks again.

20. Martin Brown (Guest)

There *is* though. You switch the load hard on and off quickly with a
variable mark-space ratio to avoid resistive losses in the load controller.

It is the basis of all modern switched mode power supplies, lamp dimmers
and cheap high power class-D amplifier designs.

The days of the power rheostat are long gone!