
Audio distortion measurement

Discussion in 'Electronic Design' started by Rob, Oct 5, 2004.

  1. Rob

    Rob Guest

    Hi all.
    Is there a simple way to measure distortion in an audio amp?
    I am looking to measure over a frequency range of say 200 Hz - 2 kHz.
    It will be a low-power amp (intercom), and obviously not hi-fi quality.
    I would just like to get a basic distortion measurement. It does
    not have to be anything fancy. Looking for a simple analog design,
    i.e. no computer / capture card / software.
    A notch filter method is fine but only works for one frequency; I am
    looking to have continuous coverage from 200 to 2000 Hz.
    I get the feeling that what I need may not be "Simple" , but maybe someone
    out there has some ideas.
    Thanks
    Rob
     
  2. I read in sci.electronics.design that Rob <>
    It's not simple. But over the range 200 Hz to 2 kHz, it's unlikely that
    the distortion changes much with frequency. So a fixed-frequency notch
    filter would give you a pretty good guide. But tunable notch filters,
    with feedback Q-multiplication, aren't TOO difficult these days. Beware,
    though, that they can be very layout-sensitive, which is surprising at
    such low frequencies.
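
    A rough numerical sketch, in Python, of what that fixed-notch measurement
    gives you (the 1 kHz tone, 48 kHz sample rate and harmonic levels below are
    invented purely for illustration):

        # Notch out the fundamental and compare residual RMS to total RMS.
        import numpy as np
        from scipy.signal import iirnotch, filtfilt

        fs = 48000
        t = np.arange(fs) / fs
        f0 = 1000.0
        # Pretend amplifier output: fundamental plus a little 2nd and 3rd harmonic.
        y = (np.sin(2*np.pi*f0*t)
             + 0.01*np.sin(2*np.pi*2*f0*t)
             + 0.005*np.sin(2*np.pi*3*f0*t))

        b, a = iirnotch(f0, Q=30, fs=fs)   # deep notch at the fundamental
        residual = filtfilt(b, a, y)       # what the analyzer's meter would see

        thd = np.sqrt(np.mean(residual**2)) / np.sqrt(np.mean(y**2))
        print(f"THD(+N) estimate: {100*thd:.2f} %")   # ~1.1 % for these numbers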

    200 Hz to 2 kHz isn't an optimum frequency range, even for an intercom,
    though. It is bass-heavy and lacks intelligibility, which is important
    for an intercom. I recommend 300 Hz to 3.4 kHz (telephone band) as a
    minimum, or 200 Hz to 4.5 kHz if you can do it.
     

  3. What John W. said. But, some other random thoughts:

    If you've got a low-distortion oscillator handy (at least, low enough that
    it's lower than what you want to measure), then all you need is a distortion
    analyzer. HP 331's and 334's will do the trick, and seem to go for between
    $50 and $200 on eBay, sometimes even cheaper, though be careful to get one
    that's known to be working. (Heck, I've got one I keep meaning to sell;
    contact me offline if you're interested and I'll give you a reasonable
    price.)

    Or, you could do a couple of notch filters, rather than just one. As John
    suggested, there's not going to be much difference between 200Hz and 2kHz;
    taking a measurement at 300Hz and one at 1kHz would tell you pretty much the
    same thing as a sweep would.

    Remember that if your system is bandwidth limited, and you test with a
    frequency near its limit, you will see very little distortion. The
    distortion products are higher frequency than the test signal (assuming
    you're looking for harmonic distortion, rather than intermodulation
    distortion), and they get filtered out by the limited bandwidth.

    For an intercom system, anything up to about 10% THD is probably going to be
    acceptable. With that much distortion, you can just about see it by eye on
    a scope; and you can certainly hear it with a trained ear.
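
    A quick way to see that band-limiting effect numerically (the soft-clipping
    "amp", its assumed 3 kHz bandwidth and the test frequencies are all invented
    for illustration):

        # Soft-clipped sine through a 2nd-order low-pass standing in for the amp's
        # bandwidth limit; the 2 kHz tone reads much lower THD only because its
        # harmonics fall above the (assumed) 3 kHz roll-off.
        import numpy as np
        from scipy.signal import butter, filtfilt

        fs = 48000
        t = np.arange(fs) / fs          # 1 second, so FFT bin k sits at k Hz

        def thd(x, f0):
            X = np.abs(np.fft.rfft(x)) / len(x)
            harms = np.sqrt(sum(X[k*f0]**2 for k in range(2, 6)))
            return harms / X[f0]

        b, a = butter(2, 3000, fs=fs)   # pretend 3 kHz amplifier bandwidth

        for f0 in (300, 2000):
            x = np.tanh(1.5*np.sin(2*np.pi*f0*t))   # soft clipping = the "distortion"
            y = filtfilt(b, a, x)                   # then the bandwidth limit
            print(f0, "Hz: THD =", round(100*thd(y, f0), 2), "%")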
     
  4. Tim Wescott

    Tim Wescott Guest

    Walter Harley wrote:

    -- snip --
    Ah -- but how do you train the ear without a distortion analyzer?
     
  5. Ken Smith

    Ken Smith Guest

    .... or a huge amount of distortion. Think about this circuit:

    [ASCII schematic, garbled in the archive: an op-amp (Vin on the + input)
    driving a complementary output stage - an NPN to VCC and a P-MOSFET to
    Vss - into the 8 ohm load, with an R-C network in the feedback path from
    the output back to the op-amp's - input.]

    For low frequencies, the distortion isn't too bad. As you get close to
    F=1/(2*PI*RC) the distortion gets very bad.
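    (For scale, with made-up values of R = 10 k and C = 10 nF, that corner sits
    at 1/(2*pi*10e3*10e-9) ~ 1.6 kHz, i.e. right inside Rob's 200 Hz - 2 kHz
    band.)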
     
  6. rob

    rob Guest

    Thanks for the help guys.
    Rob
     
    Just use your soundcard and
    http://audio.rightmark.org/index_new.shtml
    works fine.
    The idea of using an analogue notch filter and low-distortion oscillator
    these days is a bit old fashioned.
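
    For instance, a bare-bones soundcard loopback measurement along those lines
    might look like this (assumes the python-sounddevice package, a working
    loopback through the amp, and level/frequency choices that are purely
    illustrative):

        import numpy as np
        import sounddevice as sd

        fs, f0 = 48000, 1000
        t = np.arange(2*fs) / fs                           # 2 second test tone
        tone = (0.5*np.sin(2*np.pi*f0*t)).astype(np.float32)

        rec = sd.playrec(tone, samplerate=fs, channels=1)  # out through the amp, back in
        sd.wait()
        x = rec[fs//2:, 0]                                 # drop the settling time

        X = np.abs(np.fft.rfft(x * np.hanning(len(x))))
        k0 = int(round(f0 * len(x) / fs))
        fund = X[k0-2:k0+3].max()
        harms = [X[k*k0-2:k*k0+3].max() for k in range(2, 6)]
        print("THD ~", round(100*np.sqrt(sum(h*h for h in harms))/fund, 2), "%")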




    martin

     
  8. john jardine

    john jardine Guest

    Maybe simplest is to build a tunable '2 op-amp bandpass filter' set with a
    sharp 'Q' of maybe 10 or more. Audio tuning over a 20:1 range by means of a
    radio-style 2-gang variable capacitor; a 200 Hz - 20 kHz frequency range
    would be easily available.
    Essentially it would function as a manually tuned audio wave/spectrum
    analyser, as each audio distortion component could be measured/examined in
    turn.
    It's very easy to design using barely any bits and pieces. Jim Thompson has
    the 'DAPB' circuit etc. on his web site.
    regards
    john
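
    A software stand-in for that tuned-bandpass wave analyser, stepping a Q = 10
    peak filter across the harmonics of an invented distorted signal (all the
    values here are illustrative assumptions):

        import numpy as np
        from scipy.signal import iirpeak, filtfilt

        fs = 48000
        t = np.arange(fs) / fs
        f0 = 500.0
        y = np.tanh(2*np.sin(2*np.pi*f0*t))      # stand-in for a distorting intercom amp

        levels = []
        for k in range(1, 6):                    # "turn the dial" to each harmonic
            b, a = iirpeak(k*f0, Q=10, fs=fs)    # narrow band-pass, Q = 10
            levels.append(np.sqrt(np.mean(filtfilt(b, a, y)**2)))

        for k, lv in enumerate(levels, 1):
            print(f"{k*f0:6.0f} Hz: {20*np.log10(lv/levels[0]):6.1f} dB re fundamental")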
     
  9. dd

    dd Guest

    For a lo-fi measurement, employ a
    2-channel scope with one channel inverted and the channels added.
    Arrange that the input to the amp displaces the scope trace approximately
    the same as the output, then adjust the variable scope gain to zero the
    result on the screen.
    Distortion will show up as the amplitude that cannot be zeroed.
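
    The same invert-and-add idea done numerically (the gain and the cubic
    distortion law below are made up for illustration):

        import numpy as np

        fs = 48000
        t = np.arange(fs) / fs
        vin = np.sin(2*np.pi*700*t)
        vout = 4.0*vin - 0.15*vin**3      # pretend intercom amp: gain 4 plus cubic distortion

        g = np.dot(vout, vin) / np.dot(vin, vin)   # the "variable scope gain" setting
        residual = vout - g*vin                    # what refuses to zero on the screen

        print("apparent distortion:", round(100*np.std(residual)/np.std(vout), 2), "%")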
     
  10. Er, listen to someone speak?
     
  11. There is likely to be a residual fundamental due to phase shift in the
    amplifier. This is not non-linearity distortion.
     
  12. Ken Smith

    Ken Smith Guest

    If the phase shift isn't too big, this method still helps a lot to see
    distortion. You can knock the fundamental down by about 5X. This makes
    the distortion products stand out more.

    If you make an RC low-pass filter using a pot, you can line up the
    fundamental and get a much greater reduction. The nice thing about doing
    that is it also makes the 3rd harmonic in your generator less of a
    problem.
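
    A quick check of that "about 5X" figure: with the amplitude matched but a
    phase error theta between input and output, the fundamental that refuses to
    null is 2*sin(theta/2) of the original (the angles below are example values
    only):

        import numpy as np

        for theta_deg in (2, 5, 11.5, 23):
            leftover = 2*np.sin(np.radians(theta_deg)/2)   # |1 - exp(j*theta)|
            print(f"{theta_deg:5.1f} deg phase shift -> fundamental cut by {1/leftover:4.1f}x")
        # Around 11-12 degrees of phase error already limits the null to roughly 5x,
        # which is why lining the phase up with the RC network buys so much more.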
     
  13. Tim Wescott

    Tim Wescott Guest

    That gives you "good" or "bad" but it doesn't give you "oh that sounds
    like about 10%". And my "good" may be your "bad".
     

  14. Well, my thinking was that if the OP was making an intercom, then perhaps
    the distortion spec was high enough that attentive listening would be
    adequate.

    For hifi gear, it is routine to push distortion levels down to the point
    where the human ear is *not* a good indicator except perhaps in very
    specialized circumstances with very trained ears. The average person
    certainly cannot hear the residual distortion in any decent hifi component
    (using an appropriately circular definition of hifi, anyway). So to work on
    that, you really need a good distortion analyzer. But for an intercom,
    "sounds good" probably equates to "less than 5% THD+N" which might be all
    the spec that's needed.
     
  15. The flip side of that is wandering around wondering "Is 10% good or bad?"

    Obviously, you need to do both to make a fully informed decision.

    RCA used a group of old secretaries and concluded anything above 6 kHz
    was wasted, as was any distortion below 3%. Voila - the 8-track tape!
    That's why it's good to use a large number of listeners?

    An impartial answer can be obtained with one listener (don't follow RCA
    and Consumer Reports, send the listener to an audiologist for a hearing
    test first). Method: a live source and an electronic source are hidden
    behind a drape. When they sound the same - that's good enough.
     
  16. Use a square wave for a source - damn good distortion analyzer. The
    scope trace will tell all.

    TTTH: I really appreciate it when the folks designing such a product take
    the time and care to really get the distortion down to the extent that
    it sounds like the other party is right there with you.

    Quality in a product _does_ make a difference. Quality has to be
    _designed_ in and cannot be created in manufacturing or by inspecting the
    product.
    That's what I thought, till I actually tried a double blind test on an
    amplifier in my living room.

    The setup was a Crown IC150/DC150 amplifier pair (I'm showing my age
    here), AR9 speakers, Sheffield labs direct to disc record, Shure V15-???
    cartridge and Kenwood 720 turntable (? the one made from concrete and
    rosewood).

    The DUT was a $500 Kenwood A/V receiver. The Crown's 0.0012% distortion
    spec Vs the Kenwood's 0.1% - like who's going to hear that, right? I was
    all set to sing the praises of Kenwood...

    It wasn't even close - no need for the double-blind switch. No contest.
    Dirty eyeglasses Vs clean eyeglasses.

    I would not have said the ear was such a good discriminator as an
    HP distortion analyzer working at the noise floor. But it does seem
    so.

    Now, I'm no golden-eared prissy - my speakers are hooked up with a long
    length of 18ga lamp cord, I use WD-40 as a contact cleaner, you know,
    a 'regular' high-fi nut.

    This was a sample of one. And, yes, I inhaled in the '60's.
    Don't bet any money on this ...

    But, as a result, I have been converted to the "good enough specs ain't all"
    school of engineering.

    And the realization that a human is the only final arbiter.
     
  17. I read in sci.electronics.design that Nicholas O. Lindan <>
    You may have been hearing a difference due to something totally
    different from non-linearity distortion.
     

  18. I have a couple of comments.

    First: 0.1% distortion is audible to an attentive listener in an appropriate
    environment. I would not say the same of 0.01%. It is relatively easy to
    push amp distortion under 0.01% these days. (Speaker distortion is another
    story, of course.)

    Second: the amps differed on their *specs*, not their measured performance.
    Did you actually measure the distortion of the amps under the operating
    conditions? Perhaps the Crown wasn't working. Perhaps the Crown's spec was
    bogus. Perhaps the Kenwood was conservatively rated.

    Third: what HP distortion analyzer? An HP 331A, decent but old machine,
    can't measure below about 0.1%. An ear is definitely better than a 331A
    where it can be used, although in many situations (such as very high power
    or into non-loudspeaker loads or at high frequencies) an ear is not useful.

    Fourth: you say you "actually tried a double blind test." But you go on to
    say "no need for the double-blind switch." So, it sounds like you *didn't*
    actually try the double-blind test?

    Fifth: THD+N is one spec on which amps can differ. But it is not the only
    one. Frequency response, slew rate, IMD...

    Sixth: how did you calibrate the amps to achieve the same output level? A
    1dB difference in volume can be perceived as a noticeable difference in
    sonic quality, even when it is not perceived as a difference in volume per
    se.
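    (For scale: 1 dB is a voltage ratio of 10^(1/20) ~ 1.12, i.e. about 12% in
    voltage or 26% in power, which is why careful listening comparisons are
    level-matched to a small fraction of a dB.)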

    Your test does not demonstrate what you seem to feel it does.

    -walter
     
  19. Yeah, that's what I found out.
    The post was about listening, not measuring. Kadzooks man,
    unclench that sphincter.
    I think you have this bass-akwards.
    Er, know what a Crown amp is? I assume you do. Kinda dumb statement, eh?
    That's why it sounded bad?
    Oh, bloody hell, a Macaroni 2000ZX then.
    We agree.
    Didn't need to: that was the whole effing point. The difference was obvious.
    You hear the one about the Grandmother and the eggs?
    You are absolutely right. I couldn't have said it better myself.

    Hey where's 7?
     
  20. No, that would be messy.

    Yes; sorry. (No need to assume; I am on record, as a bit of Googling will
    indicate, as being a big fan of Crown.) But presumably you get the point:
    the specs don't mean diddley, if what you're interested in is what level of
    distortion people can hear, which was the original point of contention.
    What matters is how the two amps were actually performing at the time.

    This is where we part ways, I think. There are just too damn many people
    who have made the same mistake. If you didn't do the double blind
    experiment, then all we are talking about is your expectations. I'm not
    trying to deny your experience; but I am saying that it is meaningless and
    useless to me, or to anyone else.

    To spell it out: You're saying that a well-rated and well-reputed amp
    sounded better, in a non-blind situation, than a lesser-rated and
    lesser-reputed amp. Big deal. Of course it did. Perhaps for good reason,
    and perhaps not.

    Look, I personally believe you - I believe that the Crown beat the Kenwood.
    If I didn't believe you, then I wouldn't have spent all the time and money I
    have on getting the distortion as low as it is on the little headphone amp I
    sell, and I wouldn't own all the Crown amps that I do. But the bottom line
    is that what you're saying is purely anecdotal, unless there's more to it
    than you're revealing.

    Hey, I lived in Peninsula for a while... just down the road from the
    Richfield Coliseum, when it still existed.
     