# MMf-EMf (VMf): two ways to cause a current

Discussion in 'Electronic Basics' started by Xtrchessreal, Jan 28, 2006.

1. ### Xtrchessreal (Guest)

You cannot have a current without something to push the electrons out
of their covalent bonds. That is done in one of two ways: by applying
either a magnetomotive force or an electromotive force. I don't know
of an application offhand, but I suppose you can have both at the same
time as well, each applying a portion of the force that makes the
electrons move.

On a 20 amp circuit, if you short the line to the neutral, you have a
current of 20 amps for a moment before the breaker opens. 120 VAC,
2400 watts, 20 amps.

If you wanted to keep the breaker from opening, you could put a
resistor across the line and neutral. The resistor would need to be
able to dissipate many watts without burning out. The idea is to lower
the current so that you don't open the breaker at its rated value of
20 amps.

Say you want to reduce the current to only 10 amps: you would need a
12 ohm resistor capable of handling 1200 watts of dissipation. Or you
could place 1200 one-watt resistors in parallel that are equivalent to
one 12 ohm resistor. On the neutral side of the resistor you would
measure 0 volts, and on the line side you would measure 120 volts.
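The arithmetic in that paragraph can be checked with a short sketch
(the values are the example's, not measurements; the parallel-bank
detail is my own illustration of the "1200 one-watt resistors" idea):

```python
# Sizing a resistor to limit a 120 VAC line to 10 A.
LINE_VOLTAGE = 120.0    # volts
TARGET_CURRENT = 10.0   # amperes

resistance = LINE_VOLTAGE / TARGET_CURRENT  # Ohm's law: R = E / I -> 12 ohms
power = LINE_VOLTAGE * TARGET_CURRENT       # power: W = E * I -> 1200 watts

# 1200 equal resistors in parallel give the same 12 ohms if each is
# 12 * 1200 = 14,400 ohms; each then dissipates 1200 W / 1200 = 1 W.
n = 1200
each_resistance = resistance * n   # ohms per resistor
parallel = each_resistance / n     # n equal resistors in parallel
each_power = power / n             # watts per resistor

print(resistance, power)           # 12.0 1200.0
print(parallel, each_power)        # 12.0 1.0
```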

Of course, if the 120 VAC 20 amp circuit were not able to deliver 2400
watts, then the supply has the problem of being out of design spec.
IOW, the supply cannot deliver a 20 amp current as specified.

When designing a new electronic device, you need to know the total
power dissipation of the device, especially if you intend to build a
DC power supply for the delicate circuitry within. The DC power supply
needs to be designed for the total dissipation, and then you also need
to know the maximum current that can be supplied by the power supply.

I am just writing out my thoughts as I try to understand these things.
I am not going in any particular direction except for one.

The resistor is a device that limits current in a circuit, and that is
the way I understand it. When its function is to create a voltage
inside a circuit, I get confused. I get confused because the
terminology is twisted and goes against normal thinking.

Specifically, current is not possible without voltage or induction, so
how can a voltage be created by a resistor that is designed to limit
current?

I know this is done in many ways and is done all the time. Maybe it
would help to know what the power dissipation is for some of the
devices in an amplifier circuit using an op amp while it is in its
linear operation, and then its maximum dissipation. Then I can think
of it like I think of the 120 VAC in my house on a 20 amp breaker:
there is a limit to the current and the voltage, and it is easy to
figure the circuitry.

Somehow the books miss explaining things that should be easy to
understand. When an op amp is shown with its supply voltage inputs,
they should also state what the maximum available current is. That way
you can use either current or voltage to help you solve a circuit
problem.

2. ### John Popelish (Guest)

Xtrchessreal wrote:
(snip)

Not quite. You have to apply Ohm's law to calculate the current
before the breaker opens. For instance, if the total resistance in
the loop (the source resistance, the breaker resistance, and the hot
and neutral wire resistance) adds up to 1/10th of an ohm, the current
is 120/0.1 = 1200 amperes until the breaker opens.
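That fault-current arithmetic as a quick sketch (the 0.1 ohm loop
resistance is John's illustrative figure, not a measured value):

```python
# Short-circuit current is set by the total loop resistance,
# not by the breaker's 20 A rating.
SOURCE_VOLTAGE = 120.0   # volts
LOOP_RESISTANCE = 0.1    # ohms: source + breaker + wiring (illustrative)

fault_current = SOURCE_VOLTAGE / LOOP_RESISTANCE  # Ohm's law: I = E / R

print(fault_current)  # 1200.0 amperes until the breaker opens
```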
Actually, no single load on a distribution breaker is allowed to use
all the current the breaker will pass without opening. A 15 amp
circuit can have no legal load that draws more than 12 amperes,
continuously.
(snip)

And the efficiency of the supply, so you can work back to the power
needed from the AC circuit. You may also need to learn something
about power factor. The 12 amp limit for a load on a 15 amp branch
circuit provides the full 12*120 = 1440 watts only if the current is
sinusoidal and in phase with the voltage. Any other wave shape (like
the narrow pulses of current drawn by a rectifier with a capacitor
input filter) or phase relationship reduces the watts that 12 amperes
can deliver.
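A sketch of the phase-relationship part of that point, assuming pure
sinusoids so that real power is P = V * I * cos(phi); the phase angles
are illustrative, not from any particular load:

```python
import math

# Real power at 120 V RMS and 12 A RMS versus phase angle (sinusoids only).
V_RMS = 120.0
I_RMS = 12.0

for phase_deg in (0, 30, 60):
    watts = V_RMS * I_RMS * math.cos(math.radians(phase_deg))
    print(phase_deg, round(watts, 1))
# 0  -> 1440.0 W (in phase: full power)
# 30 -> 1247.1 W
# 60 -> 720.0 W (same 12 A, half the watts)
```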
Another way to say ohm is volts per ampere. A 1 ohm resistor requires
1 volt across it to have 1 ampere pass through it. A 1000 ohm
resistor requires 1000 volts across it before 1 ampere will pass
through it.
A resistor relates voltage to current by a proportionality factor (if
it is a linear resistor). Its ohms are just that factor of
proportionality. If you force a current through it, you multiply that
current by the factor (the resistance) to calculate how much voltage
that took. If you connect two resistors in series across a voltage,
they divide it up in proportion to their relative resistances. For
instance, if you connect a 9 ohm resistor and a 1 ohm resistor across
100 volts, the 9 ohm resistor drops (uses up) 9/10ths of the 100 volts
and the 1 ohm resistor drops 1/10th of the 100 volts. You could also
get there by calculating the total resistance (1+9 = 10 ohms),
dividing the voltage by that to find the current passing through both
resistors, and then multiplying that current by each of the
resistances to find how much voltage drop appears across each.
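Both routes to the divider answer, sketched with the 9 ohm / 1 ohm
example from the text:

```python
# Two series resistors divide a voltage in proportion to their resistances.
V = 100.0
R1, R2 = 9.0, 1.0

# Route 1: ratios of the total resistance.
v1_ratio = V * R1 / (R1 + R2)   # 9/10ths of 100 V
v2_ratio = V * R2 / (R1 + R2)   # 1/10th of 100 V

# Route 2: find the common current, then each drop.
current = V / (R1 + R2)         # same current through both resistors
v1 = current * R1               # drop across the 9 ohm resistor
v2 = current * R2               # drop across the 1 ohm resistor

print(current, v1, v2)          # 10.0 90.0 10.0
```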
If the resistance is zero (superconductor loops), it is possible to
have a very significant current circulate forever with no voltage
anywhere in the loop.
(snip)

Voltage is consumed (dropped) as current passes through a resistor.
The resistor does not create the voltage, which must come from
whatever is pushing that current.

3. ### Brian (Guest)

You are approaching this in the wrong way. When designing a new
electronic device (as far as the power is concerned), first you see
what voltage and current it needs to work. You then design the power
supply to provide the voltage and the current needed. Using the power
formula W = E*I, you then know the wattage the power supply will be
supplying.
Let's say you have a 9 volt battery. You put a 1000 ohm resistor
across it. Without making any calculations, you know that the resistor
will have 9 volts across it. To know how much current is going through
it, you use Ohm's law, I = E / R. The resistor is limiting how much
current the 9 volt battery can push through it. To know how much power
the resistor is dissipating, we use the power formula W = E*I. Now put
two 500 ohm resistors in series across the 9 volt battery. Because the
two 500 ohm resistors in series still equal 1000 ohms, the current
flow will still be the same as before. If you measure the voltage
across each of the resistors, it will be 4.5 volts. The current
through each resistor causes a voltage drop of E = I*R. Sometimes we
use resistors to create a given voltage (a voltage divider circuit);
sometimes we use a resistor to limit the current (the input current to
a transistor).
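Brian's battery example, worked through as a sketch with the same
numbers:

```python
# 9 V battery: first one 1000 ohm resistor, then two 500 ohm in series.
V = 9.0

# Single 1000 ohm resistor across the battery:
i_single = V / 1000.0            # Ohm's law: I = E / R
p_single = V * i_single          # power formula: W = E * I

# Two 500 ohm resistors in series (same 1000 ohm total, same current):
i_series = V / (500.0 + 500.0)
v_each = i_series * 500.0        # drop across each: E = I * R

print(i_single)   # 0.009 A (9 mA)
print(i_series)   # 0.009 A, unchanged
print(v_each)     # 4.5 V across each resistor
```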

I think you are getting hung up on power dissipation. Power
dissipation is more of a byproduct. Sure, you need to know what the
power dissipation of the components is (but not in order to understand
how a circuit works). The resistors and capacitors of an op amp
circuit are used to set the gain and frequency characteristics of that
stage.

4. ### Brian (Guest)

As an example: if you tell me that you have a relay that will consume
8 watts to operate, that doesn't tell me much. But if you tell me that
a relay needs 8 volts at 1 amp to operate, that tells me what kind of
power supply it will need. Maybe I already have a power supply that I
want to use, which puts out 9 volts and can handle up to 2 amps. I
could use a resistor in series to drop 1 volt at 1 amp. So, using
Ohm's law R = E/I and the power formula W = E*I, we would use a 1 ohm
resistor that needs to be able to handle at least 1 watt. If we didn't
use the resistor in this example, the relay would overheat, because it
would have more than 1 amp passing through it (because the voltage is
too high for it). The resistor is reducing both the voltage and the
current to the relay.
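The dropping-resistor sizing from that example, as a sketch with the
same figures (9 V supply, 8 V / 1 A relay):

```python
# A series resistor drops the extra supply voltage across itself.
SUPPLY_V = 9.0   # volts from the available supply
RELAY_V = 8.0    # volts the relay needs
RELAY_I = 1.0    # amperes the relay draws at its rated voltage

drop = SUPPLY_V - RELAY_V   # 1 V must appear across the resistor
r = drop / RELAY_I          # Ohm's law: R = E / I -> 1 ohm
w = drop * RELAY_I          # power formula: W = E * I -> 1 W minimum rating

print(r, w)  # 1.0 ohm, rated for at least 1.0 watt
```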

Brian  