hdjim69
- Jan 1, 1970
I'm teaching myself electronics, and the only place I have to ask
questions is this forum, so please excuse me if this has been asked
a million times already, but after reading several books I still have
questions on these topics. No real values, just theory in this
question.
I'm just starting the section on AC and the book is explaining why we
(homes and industry) use AC instead of DC and the use of transformers.
Now, the book says the reason we use AC is to minimize power loss:
homes and industry need a lot of current, and if we were using DC
we'd need to push a huge amount of current through the transmission
lines, and the higher the current, the more we'd lose as heat. OK,
fine. But now let's see what happens with AC. Rather than pushing a
huge amount of current, we have a very high voltage, say 200,000 to
600,000 volts, and a low amount of amps (current). But how can we have
this HUGE amount of "pressure" (the typical explanation of what
voltage is) and hardly any current? I've been reading that voltage
and current are proportional: the more voltage, the more current.
Ahh... but this isn't really the case, since current is a variable
value: it depends on the amount of resistance. So, getting back to the
transmission lines, if we have HIGH voltage and LOW current, then
resistance MUST be high. E = I * R; that is, if I is low, R must be
high to get a high value of E. And resistance is what causes heat,
which causes power loss. So how can we have low current + low
resistance = high voltage?
In summary, if we have very high voltage and low current, we must have
very high resistance, which would eliminate just about all the
current, so the loss would be almost 100%.
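If it helps show where I'm stuck, here's a quick calculation I tried
(all values made up, just to see the book's I-squared-R claim with
numbers): keeping the delivered power fixed and only changing the
transmission voltage, then computing the heat lost in the line's own
resistance.

```python
# Same delivered power at two transmission voltages (made-up values).
# The book's claim: line heat loss = I^2 * R_line, so less current
# through the same line resistance means far less loss.
P_load = 1_000_000.0   # watts delivered to the load (1 MW, made up)
R_line = 10.0          # ohms of line resistance (made up)

for V in (10_000.0, 400_000.0):    # two transmission voltages
    I = P_load / V                 # current needed for that power
    P_loss = I ** 2 * R_line       # heat dissipated in the line
    print(f"V = {V:9.0f} V, I = {I:6.1f} A, line loss = {P_loss:8.1f} W")
```

At 10,000 V that comes out to 100 A and 100,000 W lost in the line; at
400,000 V it's only 2.5 A and 62.5 W, even though the line resistance
never changed. So maybe my mistake is applying E = I * R to the line's
own resistance instead of the load?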
TIA
J