
Biasing ADC input

Discussion in 'Electronic Basics' started by thomas, Oct 22, 2004.

  1. thomas

    thomas Guest

    I have an AD594 temp sensor (~10 mV/°C output) connected to the 10-bit ADC on an
    AVR. The sensor output will be -375 mV to +1015 mV over the temperature range
    (-40°C to +100°C) I'm interested in. The ADC only accepts positive analog input, so
    I'll need to bias the signal 'up' a little to get it above 0 V. If possible
    the solution should be repeatable in production and not require individual
    calibration.

    Specs:
    Using a TC7660 or TCM829 for the negative supply to the AD594.
    I'm using the internal 2.56 V Vref of the AVR ADC. Vref is exposed on a pin.
    The AD594 can source 5 mA on its output.
    The ADC has 100 MΩ input resistance.

    Googling turned up the following suggestions:
    #1) Add a low-voltage reference in series with the temperature sensor.

    #2) Try connecting the output of the temp sensor to the ADC through a 1k
    resistor. Take another 1k resistor and connect it between the reference
    voltage and the ADC input. This may not work if your temp sensor will not
    sink enough current.

    #3) Many ADCs have voltage reference outputs that can be used to bias the
    analog input(s) of the ADC. This can be done by connecting a resistor from
    the ADC reference output to the analog input. The voltage reference output
    can be bypassed to analog ground with a small capacitor to improve
    ripple rejection. The bias resistor value can be selected based on the
    ADC input leakage current: choose a value such that the maximum
    ADC input leakage current alone causes less than 1 LSB of voltage across the
    bias resistor. That way the "offset code" of the ADC will not be overly
    dependent on input bias current.

    #1 seems straightforward enough; however, #2 only requires two resistors
    and #3 only one.
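
    For what it's worth, option #2 with two equal resistors is just an averaging
    network: the ADC input draws essentially no current, so the pin sits at
    (Vsensor + Vref)/2. A quick design-time sanity check with the numbers from the
    post (a sketch, not firmware):

```python
# Option #2: one resistor from the sensor and one from Vref to the ADC pin.
# With a high-impedance ADC input the pin voltage is a weighted average;
# equal resistors give (V_sensor + V_ref) / 2.

V_REF = 2.56        # AVR internal ADC reference, volts
ADC_BITS = 10

def biased(v_sensor, r_sensor=1e3, r_ref=1e3):
    """Pin voltage for a given sensor output, ignoring ADC leakage."""
    return (v_sensor / r_sensor + V_REF / r_ref) / (1 / r_sensor + 1 / r_ref)

v_min = biased(-0.375)          # sensor at -40 C
v_max = biased(1.015)           # sensor at +100 C
lsb = V_REF / 2**ADC_BITS       # 2.5 mV per count

print(f"ADC pin range: {v_min:.4f} V .. {v_max:.4f} V")   # 1.0925 .. 1.7875 V
print(f"sensor resolution: {2 * lsb * 1e3:.1f} mV per count")
```

    Both endpoints land comfortably inside 0..2.56 V. The catch is that the
    divider also halves the signal, so one ADC count spans 5 mV of sensor output,
    roughly half a degree C.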

    Questions:
    - Is one resistor or two the better solution?
    - #3 gives specifics on selecting the resistor value. Any help with this? I'm
    lost.
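
    On the #3 sizing rule: choose the bias resistor so that the worst-case ADC
    input leakage through it drops less than 1 LSB. A sketch of the arithmetic,
    assuming a placeholder worst-case leakage of 1 µA (substitute the figure from
    the AVR datasheet):

```python
# Size the single bias resistor of option #3 so that worst-case ADC input
# leakage drops less than 1 LSB across it.

V_REF = 2.56          # volts
ADC_BITS = 10
I_LEAK_MAX = 1e-6     # ASSUMED worst-case input leakage, amps; check the datasheet

lsb = V_REF / 2**ADC_BITS     # 2.5 mV per count
r_max = lsb / I_LEAK_MAX      # largest resistor meeting the 1-LSB criterion

print(f"1 LSB = {lsb * 1e3:.2f} mV -> keep the bias resistor under {r_max:.0f} ohms")
```

    With the 1 µA assumption that caps the resistor at 2.5 kΩ; a smaller real
    leakage figure allows a proportionally larger resistor.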

    thx,
    t
     
  2. peterken

    peterken Guest

    The easiest and most accurate solution is the "voltage divider" with two
    resistors connected to the reference.
    Use 0.1% resistors.

    #1 requires an extra reference, including its tolerances, thus requiring
    calibration
    (the AD594 has its own internal reference, so it compensates its own
    measurements)
    #2 would be the solution (if the sensor can't sink enough current, use
    larger values, e.g. on the order of 10k)
    Any voltage drops arising from the resistor network can be accounted for in
    software
    #3 has the disadvantage of different leakage currents for every device, thus
    requiring calibration
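
    The software correction mentioned above is just the divider algebra run
    backwards: with equal resistors the pin sees (Vs + Vref)/2, so
    Vs = 2·Vadc − Vref. A sketch (Python for clarity; the same arithmetic ports
    to AVR C):

```python
# Recover the sensor voltage from an ADC reading taken through the
# equal-resistor bias network of option #2.

V_REF = 2.56
ADC_BITS = 10

def counts_to_sensor_mv(code):
    """Invert v_pin = (v_sensor + V_REF) / 2 and return millivolts."""
    v_pin = code * V_REF / 2**ADC_BITS   # pin voltage for this ADC code
    v_sensor = 2 * v_pin - V_REF         # undo the averaging network
    return v_sensor * 1e3

# Endpoint checks: codes 437 and 715 are the pin voltages the divider
# produces at -375 mV and +1015 mV sensor output.
print(counts_to_sensor_mv(437))   # about -375
print(counts_to_sensor_mv(715))   # about 1015
```

    Converting millivolts to degrees via the nominal 10 mV/°C scale is then one
    more divide; note from the post's own numbers (-375 mV at -40°C) that the
    AD594 isn't exactly 10 mV/°C at the cold end, so a lookup table or two-point
    correction may still be wanted there.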

