Maker Pro

Calibration Of Electronic Equipment In The Home Workshop

M

MassiveProng

Jan 1, 1970
0
The real question is how much precision do you
really need in the home "lab"? How often have
you needed to use your DMM with how many
*accurate* significant digits? 100 minus some
*very* small percent of the time, 2 significant
digits is all you need. Do you _really_ care
if your 5.055 volt reading is really 5.06 or 5.04?

Oh hell yes, I want to puff out my chest like everyone
else and think I have *accurate* equipment.

But I'm curious as to what home circuits need meters
that can read voltage accurately to 3 decimal places?
2 decimal places? The question for current measurement:
in what home brew circuit design/troubleshooting do you
need accuracy below the tens of mA digit ? *Need*, not
*want*. Do you even trust your DMM on an amps setting
for those measurements, or do you measure the current
indirectly? How about ohms? Would you trust any
DMM, regardless of who calibrated it, to measure
down in the milliohm numbers?

To me, the design of the circuit being measured has
to take care of all of that crap. If it is so
poorly designed that a 10 mV departure from nominal
(that is missed by my inaccurate meter) will keep
it from working, that suggests other problems.
Yes, the home "lab" person wants extreme accuracy
to as many decimal places as he can get. But when does
he ever really need it?

None of this is to argue against having the best
instrumentation you can afford, or references to
check it against, or paying for calibration and so
forth. But for myself, I need a dose of reality
from time to time when I start drooling over some
accuracy specs that I will never need at home. My
bet is that most of us are seduced by that same muse.


Modern instrument accuracies are so good, and they hold their calibration so
well, that opening one up and tweaking it with less than a professional
calibration standard available is ludicrous in the extreme.

No matter how smart one is, if one has an instrument and wants to
test its accuracy, one should go somewhere a
recently calibrated instrument is available to EXAMINE one's
instrument against.

NONE should be "adjusted" at all, ever, if the variance is too small
to warrant it, and even pro calibrators follow this creed. If at all
possible, their main task is to VERIFY an instrument's accuracy
WITHOUT making ANY adjustment. ANY that DO need adjustments are
typically marked "defective" and require a factory inspection/repair.

I speak from experience, so I don't care what the ToolTard thinks
about his capacity for the task, he is a fucking retard if he tries it
without first checking his gear against known good gear.

It really is THAT SIMPLE.
 
A

Anthony Fremont

Jan 1, 1970
0
MassiveProng said:
It doesn't matter how many "places" you "spot check" it, you are not
going to get the accuracy of your comparison standard on the device
you intend to set with it. What you do is take the basic INaccuracy
of the device needing to be set, and add to it the basic INaccuracy of
the standard to which you are setting it. You CANNOT get any closer
than that. So, a 0.5% meter, and a 0.5% scope cannot be used together
to make the scope that accurate. You need a *finer* standard than the
accuracy level you wish to achieve.

You need to understand that as a basic fact, chucko.

The "basic fact" here is that we were talking about adjusting a 3% scope
with a .03% meter. Now that the numbers are back where they belong, please
proceed to restate your case. The scope's vertical sensitivity could easily
be adjusted to within 3% using said meter, now can't it? Just like Keith
says......
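Anthony's arithmetic here is easy to sanity-check; a minimal sketch (the trimming residual is an assumed figure for illustration, not from any scope's datasheet):

```python
meter_err = 0.0003   # 0.03% reference meter
residual  = 0.005    # assumed: how finely the scope's vertical gain
                     # can actually be trimmed and read back, ~0.5%

# At the cal point, the adjusted scope can be no worse than the
# meter's error plus the trimming residual:
worst_case = meter_err + residual
print(f"worst case after adjustment: {worst_case:.2%}")
assert worst_case < 0.03   # comfortably inside a 3% vertical spec
```

Even with a sloppy half-percent residual thrown in, the result is far inside the scope's 3% spec, which is the whole point of using a much finer standard.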
 
M

MassiveProng

Jan 1, 1970
0
So you don't know how to access the service menu and make changes to the
setup of your boob tube.


Sorry, you dumbfuck, but you assuming that all TVs have this
capacity proves even further how little you know about it.
 
M

MassiveProng

Jan 1, 1970
0
Good for you, please explain how the OP was going to use a DVD to
calibrate his test equipment.


Stupid shit. The suggestion I posed mine against was some twit's
suggestion of WWV and a 1 kHz tone, which is about as old hat as it gets.

You should really learn to read ENTIRE threads before you mouth off,
jackass.
 
M

MassiveProng

Jan 1, 1970
0
No, you don't, all the adjustments are done via menu now.

Wrong again, dumbass. You'd like to think that your guess is
correct, but it is not, dipshit.
 
M

MassiveProng

Jan 1, 1970
0
You don't think, you just rant. Please get it right. Maybe you could use that
DVD to calibrate your anger response, maybe you could eBay it and your
home audio system to pay for some anger management?


**** you, you fucking retard. Meet up with me, and I'll show you
how I manage it.
 
M

MassiveProng

Jan 1, 1970
0
Good comments Ed.

I want to thank everyone else who has offered *positive* comments
also.

I want you to leave the group and never return, you top posting
Usenet RETARD!
Like I said, I think this is a need for anyone who has equipment at
home.

Like I said, anyone as dumb as you are, regardless of your "tool
count", should not be futzing with perfectly good instruments.

You are simply too fucking stoopid to do it correctly.

Yes, YOU!

Learn about top posting, asswipe, and how it is frowned upon in
Usenet, or are you just another pants down past the asscrack, gang boy
retard?
 
M

MassiveProng

Jan 1, 1970
0
The "basic fact" here is that we were talking about adjusting a 3% scope
with a .03% meter.
Nope. READ HIS replies. He was talking about using a 3% meter.
Now that the numbers are back where they belong, please
proceed to restate your case.

**** you. Read HIS criteria, dipshit, don't impose yours. Remember,
it was ME that stated that the cal device had to be ten times more
accurate than the target to be cal'd. So **** off.
The scope's vertical sensitivity could easily
be adjusted to within 3% using said meter, now can't it? Just like Keith
says......

That is NOT what the retarded bastard said, you retarded bastard.
 
T

The Real Andy

Jan 1, 1970
0
The "basic fact" here is that we were talking about adjusting a 3% scope
with a .03% meter.
Nope. READ HIS replies. He was talking about using a 3% meter.
Now that the numbers are back where they belong, please
proceed to restate your case.

**** you. Read HIS criteria, dipshit, don't impose yours. Remember,
it was ME that stated that the cal device had to be ten times more
accurate than the target to be cal'd. So **** off.
The scope's vertical sensitivity could easily
be adjusted to within 3% using said meter, now can't it? Just like Keith
says......

That is NOT what the retarded bastard said, you retarded bastard.

Ahhh, who actually uses a scope to make accurate measurements?
 
J

Jim Yanik

Jan 1, 1970
0
The "basic fact" here is that we were talking about adjusting a 3%
scope with a .03% meter. Now that the numbers are back where they
belong, please proceed to restate your case. The scope's vertical
sensitivity could easily be adjusted to within 3% using said meter,
now can't it? Just like Keith says......

Actually, one CAN calibrate an instrument to a greater accuracy than its
specified accuracy (for a short time); it's called a transfer standard.
Of course, there are limits to how much greater accuracy you can
achieve, based on resolution and repeatability.

For ordinary cals, your standard should be at least 4x better than the DUT.
10x is great.
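Jim's rule of thumb reduces to a one-line ratio, what cal labs call a test uncertainty ratio; a sketch using the specs tossed around in this thread:

```python
def tur(dut_spec_pct, std_spec_pct):
    """Test Uncertainty Ratio: how many times tighter the standard's
    spec is than the device under test's."""
    return dut_spec_pct / std_spec_pct

# A 3% scope against a .03% meter: TUR of 100, far past the 4:1 minimum.
print(round(tur(3.0, 0.03)))
# Two 0.5% instruments against each other: TUR of 1, no margin at all.
print(tur(0.5, 0.5))
```

By this yardstick the .03%-meter-on-a-3%-scope case is trivially fine, while trying to set one 0.5% instrument with another is exactly the situation the 4:1 rule exists to forbid.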
 
P

Phil Allison

Jan 1, 1970
0
"The Real Andy"

Ahhh, who actually uses a scope to make accurate measurements?


** Anyone who needs to.

Low frequency (1 to 30 Hz), single shot, asymmetrical or pulse waves and
high frequencies are all "grist for the mill" even with a CRT based
scope.

Shame what happens with a DMM used on the same.



........ Phil
 
D

David L. Jones

Jan 1, 1970
0
It doesn't matter how many "places" you "spot check" it, you are not
going to get the accuracy of your comparison standard on the device
you intend to set with it. What you do is take the basic INaccuracy
of the device needing to be set, and add to it the basic INaccuracy of
the standard to which you are setting it. You CANNOT get any closer
than that. So, a 0.5% meter, and a 0.5% scope cannot be used together
to make the scope that accurate. You need a *finer* standard than the
accuracy level you wish to achieve.

You need to understand that as a basic fact, chucko.

LMAO!
If I use a 0.5% accurate meter to adjust something, then the accuracy
of that adjusted device, at that point in time at that adjusted value,
*becomes* 0.5%. The device that was adjusted only gets its accuracy
figure of 0.5% *after* the adjustment. The 0.5% of the device does NOT
get added to the 0.5% of the meter in this particular case!

Dave :)
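Dave's claim can be shown with a toy model (made-up numbers; this ignores resolution, drift, and linearity away from the cal point): trimming a device against a standard leaves it, at that point, with roughly the standard's error, not the sum of the two specs.

```python
true_v = 5.000                        # the actual voltage (unknown to us)
std_err = 0.0005                      # standard accurate to 0.05%
std_reading = true_v * (1 + std_err)  # worst case: standard reads high

device_gain = 1.03                    # device starts out reading 3% high
print(f"before: {abs(device_gain - 1):.2%} error")

# Trimming the device until its display matches the standard amounts to
# setting its effective gain to std_reading / true_v:
device_gain = std_reading / true_v
residual = abs(device_gain * true_v - true_v) / true_v
print(f"after:  {residual:.2%} error")  # the standard's 0.05%, not 3.05%
assert residual <= std_err + 1e-12
```

The 3% the device started with is irrelevant once it has been nulled against the standard; what survives is the standard's own error plus whatever this toy model leaves out (adjustment resolution, drift, and behavior away from the cal point).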
 
M

MassiveProng

Jan 1, 1970
0
MassiveProng wrote:
On 2 Mar 2007 15:09:30 -0800, "David L. Jones" <[email protected]>
Gave us:

Which is why you do it for each range and then spot check it to see
that there is no funny business. Perfectly valid technique for home
calibration of a scope vertical scale.

Dave :)

It doesn't matter how many "places" you "spot check" it, you are not
going to get the accuracy of your comparison standard on the device
you intend to set with it. What you do is take the basic INaccuracy
of the device needing to be set, and add to it the basic INaccuracy of
the standard to which you are setting it. You CANNOT get any closer
than that. So, a 0.5% meter, and a 0.5% scope cannot be used together
to make the scope that accurate. You need a *finer* standard than the
accuracy level you wish to achieve.

You need to understand that as a basic fact, chucko.

The "basic fact" here is that we were talking about adjusting a 3% scope
with a .03% meter.
Nope. READ HIS replies. He was talking about using a 3% meter.
Now that the numbers are back where they belong, please
proceed to restate your case.

**** you. Read HIS criteria, dipshit, don't impose yours. Remember,
it was ME that stated that the cal device had to be ten times more
accurate than the target to be cal'd. So **** off.
The scope's vertical sensitivity could easily
be adjusted to within 3% using said meter, now can't it? Just like Keith
says......

That is NOT what the retarded bastard said, you retarded bastard.

Ahhh, who actually uses a scope to make accurate measurements?


I guess the same idiots that claim they can calibrate one with a 3%
meter.

Also, if you do NOT know how to make accurate measurements with
scopes, you should be in some other industry.
 
M

MassiveProng

Jan 1, 1970
0
LMAO!
If I use a 0.5% accurate meter to adjust something, then the accuracy
of that adjusted device at that point in time at that adjusted value
*becomes* 0.5%.

Absolutely incorrect!

If you do that, the MINIMUM error is 0.5%. It is ALWAYS greater
than that value by that value plus the error of the device you think
you set.

How can you not understand that basic fact?
The device that was adjusted only gets its accuracy
figure of 0.5% *after* the adjustment.

Absolutely INCORRECT!

The error of a device is NOT tied to how it got set or what it got
set with, dipshit; it is tied to the precision of the circuits the device
is based upon.
The 0.5% of the device does NOT
get added to the 0.5% of the meter in this particular case!

Wanna bet?
 
E

ehsjr

Jan 1, 1970
0
Anthony said:
You surely didn't mean tens of _mA_, did you?

I surely meant tens of mA.

I build stuff with PICs as
you know, and some of it is designed to run on batteries and needs to go for
long periods of time unattended. The current draw for a 12F683 running at
31kHz is 11uA, sleep current is 50nA. If I could only measure current to
"tens of mA", I'd never know if the PIC was set up right for low current draw
and I certainly couldn't have any idea of expected battery life. I wouldn't
even know if it was sleeping until it ate thru some batteries in a few days
instead of six or eight months. I think I have a need to measure fractions
of a uA.

You may, but not accuracy below the tens of _mA_ digit.
When you need accuracy below tens of mA, you measure
voltage across a resistance. It doesn't make a lot of
sense to look for your meter to be accurate to 8 decimal
places for your .00000005 amp reading.

Here's how you do it with accuracy at the tens of _mV_ digit:

For 11 uA, put a 10K .01% resistor in series with
the supply and measure .11 volts across it. The voltage
would range from 0.109989 to 0.110011. Keep only
2 decimal places. Your computed current, worst case,
would be off by 1 uA.

For 50 nA, use a 2 meg 1% resistor and measure .10
volts across it. The voltage would range from .099
to .101 taking the 1% into account. Throw out the
last digit. Your current computation would be off
worst case, by 5 nA.

With a voltmeter accurate to 2 decimal places.
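Ed's worked examples check out; here they are as a sketch, assuming (as he does) a voltmeter good only to the 0.01 V digit:

```python
# 11 uA through a 10k 0.01% shunt ideally drops 0.110000 V; the
# resistor tolerance alone moves that between these bounds:
lo = 11e-6 * 10_000 * (1 - 0.0001)
hi = 11e-6 * 10_000 * (1 + 0.0001)
print(lo, hi)                    # ~0.109989 .. ~0.110011

# Reading the meter only to two decimal places (+/- 0.01 V):
print(0.01 / 10_000)             # worst-case current error: 1 uA

# 50 nA through a 2 Mohm 1% shunt drops ~0.100 V; the same 0.01 V
# meter granularity corresponds to:
print(0.01 / 2e6)                # worst-case current error: 5 nA
```

The resistor tolerance contributes far less error than the meter's last digit in both cases, which is why a 2-decimal-place voltmeter and a decent resistor get you into the microamp and nanoamp range.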
When he needs it he needs it, what can I say?

I asked, looking for concrete cases. Your case
with the PIC is an excellent example of when a
person needs to know about really small currents.
It definitely fits into the difference I had in mind
between "needs" and "wants". But it does not mean he
needs accuracy out to 8 decimal places. He needs it to
2 decimal places, as was shown. Three decimal places
would be nice. :)
Do I really "need" a new DSO?

I have no opinion on that, and it would be irrelevant
if I did. I don't know what your situation is.
Well, I've managed to get by all this time without one, so maybe you think I
don't really "need" one. I see it like this though: I don't get a lot of
time to tinker anymore, and I'd like to spend it more productively, instead
of fumbling around and devising silly methods to make my existing equipment
do something it wasn't designed for (like going off on a tangent to build a
PIC circuit that will trigger my scope early so I can try to see some
pre-trigger history).




I don't know if I really agree with that. ;-)

Well, you're free to argue against having the best
instrumentation you can afford, or having references
to check it against or getting it calibrated or
whatever, if that's how you feel. I tend to err on
the side of wanting the best even when it is
not the best fit for what I really need.

Ed
 
M

MassiveProng

Jan 1, 1970
0
You may, but not accuracy below the tens of _mA_ digit.
When you need accuracy below tens of mA, you measure
voltage across a resistance. It doesn't make a lot of
sense to look for your meter to be accurate to 8 decimal
places for your .00000005 amp reading.


ALL handheld meters use voltage read across a precision shunt
resistor for current readings. I am not talking about inductive
probes. Standard current.
 
T

The Real Andy

Jan 1, 1970
0
MassiveProng wrote:
On 2 Mar 2007 15:09:30 -0800, "David L. Jones" <[email protected]>
Gave us:

Which is why you do it for each range and then spot check it to see
that there is no funny business. Perfectly valid technique for home
calibration of a scope vertical scale.

Dave :)

It doesn't matter how many "places" you "spot check" it, you are not
going to get the accuracy of your comparison standard on the device
you intend to set with it. What you do is take the basic INaccuracy
of the device needing to be set, and add to it the basic INaccuracy of
the standard to which you are setting it. You CANNOT get any closer
than that. So, a 0.5% meter, and a 0.5% scope cannot be used together
to make the scope that accurate. You need a *finer* standard than the
accuracy level you wish to achieve.

You need to understand that as a basic fact, chucko.

The "basic fact" here is that we were talking about adjusting a 3% scope
with a .03% meter.
Nope. READ HIS replies. He was talking about using a 3% meter.

Now that the numbers are back where they belong, please
proceed to restate your case.

**** you. Read HIS criteria, dipshit, don't impose yours. Remember,
it was ME that stated that the cal device had to be ten times more
accurate than the target to be cal'd. So **** off.

The scope's vertical sensitivity could easily
be adjusted to within 3% using said meter, now can't it? Just like Keith
says......

That is NOT what the retarded bastard said, you retarded bastard.

Ahhh, who actually uses a scope to make accurate measurements?


I guess the same idiots that claim they can calibrate one with a 3%
meter.

My point exactly.
Also, if you do NOT know how to make accurate measurements with
scopes, you should be in some other industry.

More to the point, if you don't understand the concept of error then
you should be in another industry.
 
M

MassiveProng

Jan 1, 1970
0
On Sat, 03 Mar 2007 19:21:44 -0800, MassiveProng

MassiveProng wrote:
On 2 Mar 2007 15:09:30 -0800, "David L. Jones" <[email protected]>
Gave us:

Which is why you do it for each range and then spot check it to see
that there is no funny business. Perfectly valid technique for home
calibration of a scope vertical scale.

Dave :)

It doesn't matter how many "places" you "spot check" it, you are not
going to get the accuracy of your comparison standard on the device
you intend to set with it. What you do is take the basic INaccuracy
of the device needing to be set, and add to it the basic INaccuracy of
the standard to which you are setting it. You CANNOT get any closer
than that. So, a 0.5% meter, and a 0.5% scope cannot be used together
to make the scope that accurate. You need a *finer* standard than the
accuracy level you wish to achieve.

You need to understand that as a basic fact, chucko.

The "basic fact" here is that we were talking about adjusting a 3% scope
with a .03% meter.
Nope. READ HIS replies. He was talking about using a 3% meter.

Now that the numbers are back where they belong, please
proceed to restate your case.

**** you. Read HIS criteria, dipshit, don't impose yours. Remember,
it was ME that stated that the cal device had to be ten times more
accurate than the target to be cal'd. So **** off.

The scope's vertical sensitivity could easily
be adjusted to within 3% using said meter, now can't it? Just like Keith
says......

That is NOT what the retarded bastard said, you retarded bastard.

Ahhh, who actually uses a scope to make accurate measurements?


I guess the same idiots that claim they can calibrate one with a 3%
meter.

My point exactly.

Good thing I never made that claim.
More to the point, if you don't understand the concept of error then
you should be in another industry.

That is about the gist of what I have been trying to tell them.

Some dope thinking he can adjust his meter accurately with a damned
drifty voltage reference chip should have his head examined, not his
instruments!
 
T

The Real Andy

Jan 1, 1970
0
So it sounds like you are having a problem finding two brain cells
MiniPrick...try harder.

No more of your excuses....SHOW us how great you are.

Laugh...laugh...laugh....

TMT

At the end of the day, why do you need to calibrate your instruments?
Do you need to do it, or is it just for self-satisfaction? Are you
trying to prove a point or do you need traceable calibration? What
are you trying to achieve?
 
A

Anthony Fremont

Jan 1, 1970
0
MassiveProng said:
The "basic fact" here is that we were talking about adjusting a 3%
scope with a .03% meter.
Nope. READ HIS replies. He was talking about using a 3% meter.

I believe we were talking about scopes only being about 3% accurate in the
vertical. You are the one that pulled that garbage out of the air about
leaving the scope 6% off. You should try reading what people write instead
of what you wish they wrote.
**** you. Read HIS criteria, dipshit, don't impose yours. Remeber,
it was ME that stated that the cal device had to be ten times more
accurate than the target to be cal'd. So **** off.

Are you really that incompetent? I AM THE ONE that stated that I could use
a .03% meter to adjust it. It was in my very first post in this thread. Now
stop lying.
 