
DAQ front end design

Discussion in 'Electronic Design' started by Chris Carlen, Oct 5, 2004.

  1. Chris Carlen

    Chris Carlen Guest

    Hi:

    I'm planning to design a DSP board based on the TMS320F2812. I want to
    add a better A/D, so am in the process of designing a front-end.

    The required specs are 14-16 bits, at least two simultaneously sampled
    channels at a time, with 8 channels total. Preferably differential
    inputs, up to 100kHz full-power bandwidth and at most 4us to convert two
    channels. Some configurability of input voltage ranges: unipolar ranges
    of 10V, 5V, 1V, and 0.2V; bipolar ranges of +/-10V, +/-5V, +/-1V, and
    +/-0.2V. It's ok for the bandwidth to be reduced somewhat when using
    gain greater than 1.
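    For scale, the stated error budget works out as follows; a quick
    Python back-of-envelope, assuming the 14-bit, +/-10V case:

```python
# Back-of-envelope resolution on the 14-bit, +/-10 V range.
SPAN = 20.0          # volts, -10 V to +10 V
CODES = 2 ** 14      # 16384 codes

lsb = SPAN / CODES            # volts per code
gain_err = 0.0005 * CODES     # 0.05% gain error expressed in codes

print(f"1 LSB = {lsb * 1e3:.3f} mV")               # 1.221 mV
print(f"0.05% gain error = {gain_err:.1f} codes")  # 8.2 codes
```

    This matches the "8 LSB out of 14 bits" figure in the spec above.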

    As for the A/D I'm thinking of utilizing the Maxim MAX1324 8-channel
    simultaneous 250ksps 14-bit converter with +/-10V input range. This
    chip also has variants with +/-5V and 0-5V inputs.

    I'd like to keep component count as low as possible. Three SO8 sized
    chips per channel is the limit. Yet I want to be able to calibrate each
    channel to have no more than 1-2LSB of offset, and no more than 0.05%
    gain error (8 LSB out of 14 bits). I don't need software
    configurability or calibration. Jumpers and pots are OK.

    My stumbling block is that to handle the +/-10V input range,
    traditional instrument amps like the LT1167 will slew-rate limit above
    about 19kHz. The use of such an amp is desirable since it is a convenient
    place to inject an offset correction, as well as switching the overall
    gain ranges. My planned front end topology is as follows (a separate
    input signal chain per channel):

    A differential input stage such as the LT1167 instrument amp,
    followed by a 4-pole Bessel filter (this might be just a socket,
    allowing pluggable filter modules to be inserted), followed by a fast
    settling buffer amp to drive the A/D. In order to switch between
    bipolar and unipolar input voltage ranges, the A/D driver amp will be in
    a non-inverting configuration. It can be switched between a unity-gain
    buffer (for bipolar inputs) or a x2 amplifier with -10V offset (for
    unipolar inputs) by simply jumpering a resistor from the - terminal to a
    +10V reference (it will have an equal valued feedback resistor, of
    course). The only thing missing in this plan is a good way to tweak
    overall gain to calibrate to my desired spec. That could be done by
    tweaking the gain in the filter stage, though, or by tweaking the LT1167
    gain resistor. The latter method wouldn't work for the 10V ranges
    though. I plan to inject an offset adjustment to the LT1167's reference
    terminal.
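    With equal feedback and jumpered resistors and a +10V reference, the
    driver stage described above works out to Vout = 2*Vin - Vref in the
    unipolar setting. A quick sketch of that transfer function (values
    assumed, not a verified circuit):

```python
# Driver-amp transfer for the two jumper settings described above
# (equal feedback and jumpered resistors, Vref = +10 V).
VREF = 10.0

def driver_out(v_in, unipolar):
    if unipolar:
        # Non-inverting x2 with the jumpered resistor to +10 V:
        # Vout = Vin*(1 + Rf/Rg) - VREF*(Rf/Rg) = 2*Vin - 10 for Rf = Rg.
        return 2.0 * v_in - VREF
    return v_in  # unity-gain buffer for bipolar ranges

print(driver_out(0.0, True))    # -10.0
print(driver_out(10.0, True))   #  10.0
print(driver_out(-10.0, False)) # -10.0
```

    So a 0..10V unipolar input lands exactly on the converter's -10..+10V
    span, and the bipolar ranges pass through unchanged.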

    The MAX1324 has some offset and gain adjustment facilities, via an
    external reference and an "MSV" (mid-scale voltage) input pin. But the
    channel to channel gain and offset error matching specs are less than my
    requirements, so that's why I'd like to be able to individually
    calibrate each channel for offset and gain.

    It seems the only way to get fast differential inputs would be to roll
    my own instrument amp stage.
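    The ~19kHz figure follows from the full-power-bandwidth relation
    f_fp = SR / (2*pi*Vpeak). A quick check, assuming the LT1167's
    datasheet-typical slew rate of about 1.2V/us:

```python
import math

# Full-power bandwidth from slew rate: f_fp = SR / (2*pi*Vpeak).
SR = 1.2e6     # V/s: LT1167 typical slew rate (~1.2 V/us)
V_PEAK = 10.0  # volts, for the +/-10 V range

f_fp = SR / (2 * math.pi * V_PEAK)
print(f"full-power bandwidth ~= {f_fp / 1e3:.1f} kHz")  # ~19.1 kHz
```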

    Is there a better way?

    One obvious question is whether differential inputs are really
    needed. The inputs are likely to be fed by signal sources external to
    the DSP board, and carried over cables. In situations like these, where
    DC accuracy and low noise are required, I have always thought it
    necessary to have differential inputs.

    Thanks for inputs (no pun intended).




    --
    _______________________________________________________________________
    Christopher R. Carlen
    Principal Laser/Optical Technologist
    Sandia National Laboratories CA USA
    -- NOTE: Remove "BOGUS" from email address to reply.
     
  2. Tim Wescott

    Tim Wescott Guest

    If you're using a DSP why not apply your gain and offset corrections in
    software? That'll significantly reduce your component count, and the
    '2812 is one fast processor so it can take it. As long as your input
    circuits and ADC's don't change too much over time your calibration will
    hold just fine.
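    The correction Tim describes is a per-channel affine map applied to
    the raw codes. A minimal sketch, with placeholder calibration
    constants standing in for values that would be measured at production:

```python
# Per-channel software correction: corrected = (raw - offset) * gain.
# The constants here are hypothetical placeholders for measured values.
CAL = {
    0: (3.2, 1.00041),   # channel: (offset in codes, gain correction)
    1: (-1.8, 0.99976),
}

def correct(channel, raw_code):
    """Apply the stored offset and gain correction for one channel."""
    offset, gain = CAL[channel]
    return (raw_code - offset) * gain
```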

    For DC sensitive measurements over long cables I would certainly use
    differential inputs. If you _really_ want to lean on the DSP you may
    consider running the + and - inputs into individual ADC channels and
    doing the subtraction in software -- this won't do you any good if it
    makes your signals leave common-mode range, and it isn't the right
    answer in every case, but I've seen it used with great success.
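    The software-subtraction idea, sketched for a 14-bit +/-10V range
    (the channel codes here are just illustrative):

```python
# Subtracting two simultaneously sampled single-ended channels in
# software to form a pseudo-differential reading.
LSB = 20.0 / 2 ** 14  # volts per code on the +/-10 V, 14-bit range

def diff_volts(code_plus, code_minus):
    # Both wires must individually stay inside the ADC's input range.
    return (code_plus - code_minus) * LSB

print(diff_volts(9000, 8800))  # 200 codes -> ~0.244 V
```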
     
  3. Joerg

    Joerg Guest

    Hi Chris,
    I have never used differential inputs in such a system unless the signal
    came prescribed that way, such as in high-end audio apps. Mostly this
    is done to avoid ground loops, but for noise, good coax plus some
    common-mode toroids usually do the trick. For unipolar signals you could
    use any op amp that is powerful and fast enough.

    Also, typically such converters need to be fed from a very low impedance
    to avoid digital chatter entering through the front.

    Regards, Joerg
     
  4. Chris Carlen

    Chris Carlen Guest

    I had thought of that, but it can't deal with the possibility of a
    negative offset, i.e., where it takes several LSBs of input signal
    voltage before the code changes from 0x0000 to 0x0001.

    It also can't deal with too high a gain, where code 0x3FFF is reached
    before the maximum signal is reached, even though the offset is correct.

    Isn't this correct? Software calibration would be usable where the
    available code range is wider than the range intended to be used,
    including all possible errors.
    Hmm.
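    A small sketch of why clipping defeats software calibration: once
    codes saturate at 0x0000 or 0x3FFF, distinct inputs collapse to the
    same code, and no later arithmetic can tell them apart (the offset
    value here is hypothetical):

```python
LSB = 20.0 / 2 ** 14  # ~1.22 mV per code on the +/-10 V, 14-bit range

def adc_code(v, offset_v=-0.005):
    """Idealized ADC with a negative offset error (value hypothetical)."""
    code = round((v + 10.0 + offset_v) / LSB)
    return max(0, min(0x3FFF, code))  # codes clip hard at the rails

# Two different inputs near the bottom of the range both read 0x0000.
print(adc_code(-10.000))  # 0
print(adc_code(-9.997))   # 0
```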

    Thanks for input.

    Good day!



     
  5. Tim Wescott

    Tim Wescott Guest

    Correct. You need to deliver data to the ADC that is in range -- unless
    you're just barely scraping by for ADC resolution you ought to be able
    to ensure this with conservative design.
     
  6. Chris Carlen

    Chris Carlen Guest

    I thought most commercial DAQ hardware was designed so that the full
    code range mapped exactly to the analog input voltage range?




     