Multimeter accuracy -- 3d, 5d, 10d?

Discussion in 'General Electronics Discussion' started by Justinicus, Jan 17, 2014.

  1. Justinicus

    Howdy folks,

    I've been searching for hours, and can't find out what these terms mean (3D, 5D, 10D). I figure it must be something so basic, no one ever bothers to define it.

    I just picked up another cheapo 3.5-digit DMM to quickly check battery voltage and continuity without having to get out my "good" model. I was reading through the manual (I've never looked at a DMM manual before!) and came across a new term in the accuracy specs. Between 200 mV and 200 VDC, its accuracy is ±0.5% ±5D. At 200 VAC, its accuracy is ±1.2% ±10D.

    I thought it might be significant digits, but that doesn't seem to hold up.

    Any help for the noob?
     
  2. duke37

    My interpretation is that at 200V AC, it should read 200.0
    It could read 1.2% high or low, i.e. 197.6 to 202.4
    Add or subtract 10 digits gives 195.6 to 203.4

    Check it on any critical range with your posh meter.
     
  3. KrisBlueNZ

    KrisBlueNZ Sadly passed away in 2015
    "±10D" means ± ten times the value of the least significant digit. So it's like Duke said except that the final range should be 196.6~203.4.
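    A quick sketch of that arithmetic in Python (a hypothetical helper, not from any post above; it assumes one "digit" is worth one count of the least significant displayed digit, i.e. 0.1 V on a 3.5-digit 200 V range):

    ```python
    def dmm_bounds(reading, pct, digits, resolution):
        """Worst-case bounds for a spec of ±pct% of reading ±N digits.

        The percentage and digit errors are summed; one "digit" is
        worth one count of the least significant displayed digit.
        """
        err = reading * pct / 100 + digits * resolution
        return reading - err, reading + err

    # A 200.0 V AC reading with a ±1.2% ±10D spec and 0.1 V resolution:
    lo, hi = dmm_bounds(200.0, 1.2, 10, 0.1)
    print(f"{lo:.1f} to {hi:.1f}")  # 196.6 to 203.4
    ```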
     
  4. jpanhalt

    Interesting how confusing such specifications can be, isn't it?

    If we are talking about a 3-1/2-digit meter, it cannot display "202.4." It should read "out of range". 199.9 is the largest 3-1/2-digit value. I am assuming the calculation that was shown was just hypothetical to make the point. :)

    As for the other part, i.e., combining a % error and a ±digit error, I agree, but I thought there was an added caveat that whichever error calculation gave the larger error was the one applied. Some way to arbitrate between the two notations is needed, because they will not always agree.

    John
     
    Last edited: Jan 18, 2014
  5. KrisBlueNZ

    AFAIK the maximum error is specified as the SUM of the percentage and least-significant-digit errors. I haven't seen it specified as "whichever is higher". This is what duke37 showed in his calculations.
     
  6. jpanhalt

    I need to read about it some more. A reading of 003.0 V on a 2% / 10D meter could in theory be off by ±0.06 V and ±1 V, which doesn't make sense treating the errors additively (i.e., it could read as high as 004.1 V or as low as 001.9 V). Even taking the greater of the two errors doesn't make sense.
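    Treating the errors additively, as above, the numbers work out as follows (a sketch assuming the least significant displayed digit on this range is worth 0.1 V):

    ```python
    # Hypothetical check of the 2% / 10D example: a 3.0 V reading,
    # assuming one "digit" is worth 0.1 V on this range.
    reading, resolution = 3.0, 0.1
    err = reading * 2 / 100 + 10 * resolution   # 0.06 V + 1.0 V = 1.06 V
    print(f"{reading - err:.2f} V to {reading + err:.2f} V")  # 1.94 V to 4.06 V
    ```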

    Is it auto ranging so that 003.0 is always read as 03.00?

    John

    EDIT:
    I stand corrected. Here is the statement from Fluke:
    [Attached image: Capture1.PNG, Fluke's statement on accuracy specifications]

    I guess I would have second thoughts about buying a ±10D meter.

    John
     
    Last edited: Jan 19, 2014
  7. (*steve*)

    (*steve*) ¡sǝpodᴉʇuɐ ǝɥʇ ɹɐǝɥd Moderator
    It depends on the source of the error.

    If the error is random, then the last digit is almost meaningless. However, if it is a limit on accuracy but not on precision, then whilst you can have no confidence in that last digit being correct, you can still rely on changes in that last digit meaning a real change in the reading.

    As an example, I have a very old Fluke meter (nixie tube display!) that allows me to set the interval over which the output is averaged. At high refresh rates the fifth digit is all over the place; as you slow it down, the variation in the readings gets smaller and smaller. Whilst the reading may be no more accurate, it is a lot more precise.
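    A toy simulation of that accuracy/precision distinction (all the numbers here are invented): averaging more samples per displayed reading tightens the spread of repeated readings, but does nothing about a fixed calibration bias.

    ```python
    import random
    import statistics

    random.seed(0)
    TRUE_V, BIAS, NOISE = 5.000, 0.050, 0.020   # invented values, in volts

    def reading(samples):
        """Average `samples` noisy conversions into one displayed value."""
        return statistics.mean(TRUE_V + BIAS + random.gauss(0, NOISE)
                               for _ in range(samples))

    for n in (1, 100):
        vals = [reading(n) for _ in range(200)]
        # The spread (precision) shrinks as averaging increases, but the
        # mean stays near 5.05 V: the bias (accuracy limit) never averages out.
        print(n, round(statistics.stdev(vals), 4), round(statistics.mean(vals), 3))
    ```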
     
  8. jpanhalt

    Your example is not quite the same thing. I thought we were discussing a device where you do not have that option of trading off response time for accuracy.

    I, too, can name instruments whose accuracy, compared to the majority of other instruments on the market, was off by more than 100% (a function of the technology), but whose precision was the best on the market. Their precision and speed are what made them useful. Those instruments were for measuring enzymes for which universally accepted standards did not exist.

    So far as I know, volts are volts. There is insignificant disagreement on the definition. :)

    John
     
  9. (*steve*)

    It doesn't matter. What I was talking about is repeatability.

    Sure, the meter may be relatively inaccurate, but it may be highly precise.
     