Ross said:
DarkMatter wrote:
On Tue, 27 Jan 2004 05:03:52 GMT,
[email protected] Gave us:
It will let you determine the amperage used by any
of your devices with pretty good accuracy and at low
cost.
Extremely good accuracy, in fact. As accurate as the resistor
and meter are. :-]
Yes!! It's often a pain in the ass to see the
error introduced when using an ammeter in series
with the load - and sometimes worse than that,
screwing up the circuit. I can't believe that some
respondents can't see the value of the method.
What type of ammeter do you use for such measurements?
I'll answer this question and others inline, as thoroughly
as I can, below, to try to resolve this as best I can.
That makes it long, of necessity.
That depends on what I'm doing. A couple of examples
with a project I'm currently working on:
I've got a battery pulser that sends a short pulse
of 3.2 amps into the battery. Running the pulse in
series through an ammeter, or putting a clamp on
the wire, produces a pronounced effect on the waveform
due to the change in inductance. Hell - I can't even
move the short connecting wires without changing the
waveform. So I have to use a resistor in the circuit,
and measure the voltage across it with the scope to get
a current measurement. When I discharge the battery
(12V auto battery) into an auto headlamp, I am looking
for a precision discharge to exactly (as close as my
Fluke's accuracy allows) 11.89 volts at room temperature. I
want to be able to see the exact current drawn at every
10 mV step along the way below 12.2 volts, again, within the
accuracy of my test equipment. I can do that with a .01
ohm 1% precision resistor in series with the headlamp,
measuring the voltage across the resistor, and get a more
accurate reading than the Fluke gives, because there is a
voltage drop across the Fluke when I put it in series with
the load to measure amps directly.
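The scope-and-shunt arithmetic above is just Ohm's law. A quick sketch in Python, using the .01 ohm resistor value from the headlamp setup (the voltage reading is a hypothetical example, not a measurement from the thread):

```python
# Sketch of the shunt method: current is recovered from the voltage
# measured across a known series resistor (Ohm's law, I = V / R).

R_SHUNT = 0.01  # ohms, the 1% precision shunt in series with the headlamp

def shunt_current(v_measured):
    """Current through the shunt, from the voltage across it (I = V / R)."""
    return v_measured / R_SHUNT

# Hypothetical example: 45 mV across the 0.01-ohm shunt means
# roughly 4.5 A through the lamp.
print(shunt_current(0.045))
```

The meter never carries the load current, so its burden voltage never enters the picture; only the shunt's known R does.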
In general, I've run into this problem over and over
when I want "exact" (as close as my equipment will allow)
measurements. There is a voltage drop across the
ammeter when it is in series with the circuit being
measured. Try this yourself: set up a circuit that
draws (as measured with your ammeter in series with
the load) current that is a little less than the
maximum for the range the meter is set to. Now,
turn the meter range selector up to the next higher
range. You'll often (always?) get two different
readings. Ask yourself which one is correct? Why
are they different? (Answers below)
And what type of
accuracy are you looking for?
Try the test I mentioned above. The accuracy
needed would depend on the circuit - but I
would like readings on 2 different scales
to be within 2% of each other. With a typical 1%-of-full-scale
spec, that is unreasonable:
say one range is 2000 mA and the other is 200 mA.
A 1% error on 2000 mA is 20 mA. A 1% error on the
200 mA scale is 2 mA. If I am measuring 100 mA,
the meter could say 98 mA on the 200 mA scale,
and it could say 120 mA on the 2000 mA scale.
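The arithmetic above, written out as a few lines of Python (1% of full scale on each range, same 100 mA measurement as in the example):

```python
# A meter specified as "1% of full scale" gives very different
# worst-case errors on different ranges for the same true current.

def worst_case_reading(true_ma, full_scale_ma, err_pct=1.0):
    """Return (low, high) worst-case readings for a full-scale error spec."""
    err = full_scale_ma * err_pct / 100.0
    return true_ma - err, true_ma + err

print(worst_case_reading(100, 200))   # 1% of 200 mA is +/-2 mA  -> (98.0, 102.0)
print(worst_case_reading(100, 2000))  # 1% of 2000 mA is +/-20 mA -> (80.0, 120.0)
```

So two in-spec readings of the same 100 mA can legitimately disagree by over 20%.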
Now try the test, measuring voltage across a
precision resistor. Ask yourself why that method
yields a far more consistent current. With the
precision resistor and different voltage scales,
I get what I want. I know my measurement is within
the accuracy of my equipment. The reason is that
putting an ammeter in series with the load
changes the circuit, by adding an unknown R
(the resistance of the test leads and internal
meter shunt) - and changing the meter setting
changes the circuit again, by changing the shunt
R. With the .01 ohm precision resistor measurement,
there is no change to the circuit, other than the
known R. On voltage scales, the internal impedance
of the meter is negligible with respect to the circuit,
regardless of scale. An ugly thing with DMMs is
that on the amps scales, the closer the scale is
to the current being measured, the more the meter
affects the current being drawn. The farther away
it is over the actual current being drawn, the less
it impacts the circuit - but the lower the resolution
is. Going under the range (e.g. setting the meter to
the 200 mA scale to measure 300 mA) is a non-starter:
you can't get a reading and you may blow the fuse
or cook the meter if it's not protected.
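A toy circuit model makes the range effect concrete. The burden (internal shunt) resistances below are hypothetical illustrations, not specs for any real meter; the point is only that lower ranges use larger shunts, so the selected range itself changes the current the circuit draws:

```python
# Toy model of the range test: the meter's internal shunt resistance
# (burden) depends on the selected range, and inserting it in series
# changes the current actually drawn. All values are hypothetical.

V_SUPPLY = 5.0   # volts
R_LOAD = 50.0    # ohms: nominal load current would be 100 mA with no meter

# Hypothetical burden resistance per range: lower ranges, larger shunts
BURDEN = {"200 mA": 1.0, "2000 mA": 0.1}  # ohms

for rng, r_meter in BURDEN.items():
    i_ma = 1000.0 * V_SUPPLY / (R_LOAD + r_meter)
    print(f"{rng} range: circuit draws {i_ma:.2f} mA")
```

In this model the same circuit draws about 98.04 mA on the 200 mA range and about 99.80 mA on the 2000 mA range, so the two readings differ even before the full-scale accuracy spec is applied.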
And how does that relate to the OP who wanted
an approximation?.....Just a curiosity at this point.
The OP wanted 1) an approximation of the current his
PC would use OR 2) an easy way to determine it
for himself. In addition, 3) he wanted to know the
current other devices would draw. Here's the
part of his post that addresses what he wants
beyond the approximation of what his PC uses:
"or if there is an easy way for me to determine
this on my own. (in U.S. with 120 V circuit).
I have other devices that I'd like to know also, such
as electronic music synthesizers and an audio mixer. "
It is the cheapest way for him to measure the current
of the PC and the other devices himself. It also
happens to be far more accurate than an approximation.
It avoids the safety problem of opening the AC supply
line and having 120 volts AC through the meter leads
& connections. DM described putting the resistor in
the neutral side inside a power strip, so the voltage
at the meter leads would be low. As far as I know, no
one in the thread described a safe way to use an inline
ammeter. A clamp on is more expensive (I'm hoping
you'll provide a cheaper source than I have found for
them) and the use of a clamp requires opening things
up to clamp on a single lead - again a safety exposure.
It is a cheaper, safer and more accurate way to provide
answers to parts 2 and 3 of what he wanted - and
in answering part 2, he gets the answer to part 1,
only it is far better than an approximation.
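For the mains case, the same Ohm's-law sketch applies: an RMS voltmeter across a small shunt in the neutral gives line current, and line voltage times that current gives apparent power. The shunt value and voltmeter reading below are hypothetical examples, not values from the thread:

```python
# Sketch of the power-strip measurement: a small shunt wired into the
# neutral, with an RMS voltmeter across it. Values are hypothetical.

R_SHUNT = 0.1    # ohms, shunt placed in the neutral conductor
LINE_V = 120.0   # US mains, volts RMS

v_rms = 0.085                 # hypothetical reading across the shunt, V RMS
i_rms = v_rms / R_SHUNT       # RMS line current, amps
va = LINE_V * i_rms           # apparent power (VA); real watts need power factor

print(f"{i_rms:.2f} A RMS, about {va:.0f} VA")
```

Note that for a reactive load like a PC power supply, VA overstates the watts drawn; the shunt method gives current directly, which is usually what circuit-loading questions need anyway.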