ADC mux charge injection on commercial DAQ boards

Discussion in 'Electronic Design' started by CC, Sep 30, 2007.

  1. CC

    CC Guest

    Hi:

    A couple years ago at work our computer programmer specified Data
    Translations DT3010-268 boards for our lab's high speed data acquisition
    tasks. I was assigned the task of building a BNC breakout panel for all
    the DT3010's signal ins/outs.

    This didn't involve any electronics design, but rather just wiring. I
    made a 4 layer PCB anyway to keep analog and digital signals over
    respective ground planes, and since multiple labs had to be equipped
    like this, it was more efficient than hand wiring each one.

    Since I knew there are always gotchas with these off-the-shelf boards, I
    designed in 7-pin SIL sockets near each analog input BNC to accept a
    future analog input buffer module, in case it would be needed. I put in
    buffer sockets for analog outs and the digital IOs too. Then I just
    jumpered them all for the installation.

    This seemed like a good compromise rather than building buffers into
    each channel by default, which would have made the board design and
    assembly take much longer than just a simple PCB wiring harness.

    As anticipated, there were troubles. When they started evaluating the
    performance of the DT3010, they found that as sample rates increase, the
    digitized signals incur very large errors. Some commercial
    calibration instruments, such as Keithley Source Meters, cannot drive
    the DT3010 inputs and get correct readings under any circumstances.

    I got involved and scoped around until I concluded that the DT3010
    directly multiplexes the sample voltage holding capacitor from one
    channel to another. This cap appears to be in the range of 120pF. So
    as the board switches channels, the prior channel's voltage gets kicked
    out of the input BNC of the next channel. The settling time thus depends
    on the combination of the cable length and impedance leading to the
    board, and whatever driver is feeding that cable. Typical instrument
    voltage outputs can't settle fast enough (about 600ns from mux switch to
    hold) to give accurate readings.
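    As a rough sketch of why the source driving the cable matters so much here, consider a single-pole RC model of the recharge: the 120pF figure and the ~600ns mux-to-hold window are from the description above, while the source resistances are purely assumed for illustration.

```python
import math

# Single-pole sketch of the mux "kick" settling. The 120 pF hold cap
# arrives charged to the previous channel's voltage and must recharge
# through the source's output resistance before the ~600 ns window closes.
# (120 pF and 600 ns are from the post; resistances below are assumed.)
C_HOLD = 120e-12      # hold capacitance seen at the input
T_SETTLE = 600e-9     # time from mux switch to hold

def residual_error(delta_v, r_source, c_extra=0.0):
    """Fraction of the channel-to-channel step still present at hold time,
    assuming a simple RC settle (cable lumped in as extra capacitance)."""
    tau = r_source * (C_HOLD + c_extra)
    return delta_v * math.exp(-T_SETTLE / tau)

# A stiff 50 ohm driver settles a 10 V step to nothing measurable...
err_fast = residual_error(10.0, 50.0)
# ...while an assumed 10 kohm instrument output leaves most of the step.
err_slow = residual_error(10.0, 10e3)
print(f"50 ohm source:  {err_fast:.2e} V residual")
print(f"10 kohm source: {err_slow:.2e} V residual")
```

    With the assumed 10 kohm source, over 6 V of a 10 V channel-to-channel step is still sitting on the input at hold time, which matches the symptom of large errors at high scan rates.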

    The two labs using these new boards worked around the problems and other
    tasks took priority over building buffers, until a few days ago. Now we
    have a lab that simply can't get measurements done with the present
    level of performance. The scientist has been running at the lowest
    possible sample rates and using as few channels as possible to get
    reasonable data. But now he needs something better, so I built analog
    input buffers consisting of a LT1167A instrumentation amp for the
    inputs, followed by a LT1220 fast settling opamp in simple follower
    configuration to feed the DT3010 inputs.

    I am fairly confident this buffer will solve the problem when I assemble
    and test it next week.

    But I told the programmer who continually specifies Data Translations
    products that I think the DT3010 is "broken" for not including input
    buffers on each channel to isolate the sampling capacitor from the
    outside world.

    He became very agitated and said that it's not broken because all the
    other makers do the same thing (which I am not certain about). That
    logic escapes me in any case. He did agree when I rephrased it as, "the
    DT3010 is a cheap-sh** design." But then he went on to accuse me of
    having too high standards (which he then insulted with further use of
    the sh** word).

    I simply stated that I have a lab that is broken and can't record data
    at the DT3010's specifications, and that I am going to fix it. He
    cannot answer the question of how to fix it himself, because he isn't
    able to design analog circuits. But he accuses me of having too high
    standards and that I over design everything by 10x simply because I
    think that a DAQ system should work as advertised.

    Actually, if I had not designed in the hooks to allow inserting analog
    buffers into my PCBs, I would be faced with having to re-do the entire
    panel, or building another box containing analog buffers.

    The programmer also stated that for every job he does, his goal is to do
    the absolute minimum work to get something working. And if 90% of the
    time there aren't problems which come back to him, then he has done too
    much. The consequence of this is that the expensive lab time and the
    time of scientists get expended in finding and solving problems. But he
    takes credit for getting jobs done (which really aren't "done" at all)
    very quickly and efficiently.

    I have the opposite philosophy. I spend a large amount of time in my
    workshop designing things to be nearly perfect before delivery to the
    lab, because I believe it is my job to use my time to find and resolve
    the problems before installation so that precious lab time isn't wasted.
    I have never received a complaint that it took too long, only
    compliments that my stuff works perfectly.

    This difference of view has now led to quite a bitter state of conflict
    between us, which is unfortunate. I'm not sure how it will play out. I
    am pissed because now I fear this guy has bad-mouthed my work in the
    past as taking too long to complete and over-engineered. So he makes
    himself look like he's prolific and I'm not. But I'm the one who winds
    up with the job to make his partially functional work reach full
    functionality. And I get put down for doing it "too well."

    I maintain my position that the DT3010-268 is "broken." Would you agree
    or disagree? Should it be the job of the customer of a commercial DAQ
    board to have to build analog input buffers on every channel before the
    board can be used to specification for anything other than "battery"
    voltage sources located a few inches from its inputs?

    Good day!
     
  2. John Larkin

    John Larkin Guest

    All CMOS multiplexers kick out charge when they switch, plus whatever
    signal is loaded onto whatever downstream capacitance there is, in the
    ADC front end, gets switched between channels, too. So the inputs are
    hardly simple, passive loads.

    Lots of opamps go nuts when hit by a mux spike, and can take a long
    time to recover. They may rectify the spike somehow, and then a high
    sample rate winds up creating a big DC offset.

    The easiest fix is a series R-C just ahead of the mux; something like
    100 ohms, which most opamps can drive, and as much C as the signal
    bandwidth can stand, some number of nF at least. The charge injection
    can be treated as an average DC current, which produces an offset
    error into the 100 ohm resistor, so this can have problems, too,
    especially at very low signal levels.
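A quick sketch of the offset-error point above, treating the injection as an average DC current into the series resistor; the 100 ohm value is from the suggestion above, while the per-event charge and switching rate are assumed round numbers for illustration.

```python
# Charge injection treated as an average DC current (assumed numbers):
Q_INJ = 10e-12        # assumed charge kicked per switch event, 10 pC
F_SAMPLE = 100e3      # assumed switching rate seen by the channel, 100 kHz
R_SERIES = 100.0      # series resistor ahead of the mux (from the post)

i_avg = Q_INJ * F_SAMPLE          # average injected current
v_offset = i_avg * R_SERIES       # DC offset developed across the resistor
print(f"average current: {i_avg*1e6:.1f} uA -> offset: {v_offset*1e6:.1f} uV")
```

With these assumed numbers the offset is around 100 uV, which is indeed negligible for volt-level signals but a real problem at very low signal levels.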

    Another trick is to select a dummy channel, grounded or Vcc/2, between
    active channels. That keeps channel-channel crosstalk down.

    People who design data acquisition boards should at least use very low
    charge-injection mux's (as opposed to very cheap ones) and buffer
    properly downstream, to keep the crosstalk capacitance low.

    Nowadays, you might almost as easily go with an ADC per channel.

    John
     
  3. This is more than just a design philosophy difference. It's driven by
    some fundamental real and perceived differences between software and
    hardware.

    Software is a lot more malleable and flexible (especially over time)
    than hardware. As a result, software jobs often never finish; there is
    always one more small feature to add, or a 'simple' way to have the SW
    work around a HW limitation. In addition, because of the perceived
    simplicity of changing the SW, the specifications for SW are often
    fuzzy, incompletely thought out, and more grandiose than is necessary to
    do the required job. These characteristics of SW specifications make
    it a good idea to complete SW that does the most useful and important
    fraction of the job and have it used before attempting to finish. Often
    the rest of the specified functionality is unnecessary, or completely
    different functionality will be more useful.

    This should be a lot less true for a well established product area, but
    I would expect it to be true in spades for a laboratory environment.

    Now what he does deliver should have the goal of being bug free, but
    there are a lot of advantages to delivering short of being functionally
    complete. A good design will be done with an eye to expanding
    capabilities in the future; an excellent design will succeed in having a
    sufficiently general base that future modifications fit in cleanly; an
    exceptional design will be so simple and clean that it doesn't require
    changes, since additional functionality can be built out of the existing
    functionality.

    Working in hardware you don't have the same luxury of incrementally
    adding features ad infinitum, and you will usually have a more solid
    specification to start from.
    That is unfortunate. The forces driving one discipline should not be
    imported into another.
    Sympathise, but disagree.
    Their job is to specify what the inputs are including frequency
    capability and drive requirements. Yours is to determine how to
    translate from your source characteristics to their input
    characteristics.

    Robert
     
  4. CC

    CC Guest

    Well said.
    Yes. In my case the specs are rarely quantitatively stated when I'm
    initially told what the job is. I then work with the PIs to pin it down
    or else make some reasonable guesses based on understanding what they
    are up to. I tend to err on the side of making it somewhat better than
    necessary. But more pertinent is that since their very frequent
    tendency is to make substantial changes to the requirements after 50-90%
    of the work has been done, this forces me to try to anticipate these
    changes and to plan hardware that can easily be adapted to a range of
    unforeseen modes of operation. This can be very time-consuming. But I
    have more experiences where this paid off than ones where I have regrets.

    So yes, this is much easier to do with software which is why I am trying
    to employ uCs and PLDs in everything I do except for the absolutely
    essential analog stuff.
    Well you've provided an interesting insight into the situation. Much
    appreciated.
    Fair enough. When you put it that way I don't think it's that much of a
    disagreement. Looking at the specs of the DT3010:

    http://www.datx.com/images/specs/DT3010Specs.pdf

    indicates that the inputs are spec'd as 100Mohm, 10pF off/100pF on.
    That doesn't explicitly state what will happen in use: the 100pF "on"
    capacitance is not at the same voltage as the 10pF "off" state, which
    means charge injection will occur.
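    A rough charge-sharing estimate follows from those spec'd capacitances; the cable capacitance below is my assumption (roughly a metre of coax), not anything from the datasheet.

```python
# Charge-sharing estimate of the kick at the BNC. When the spec'd 100 pF
# "on" capacitance, still holding the previous channel's voltage, connects
# to the input, charge redistributes between it and whatever capacitance
# the cable presents. (Cable value is assumed, ~1 m of coax at ~100 pF/m.)
C_ON = 100e-12        # "on" input capacitance from the DT3010 spec sheet
C_CABLE = 100e-12     # assumed cable capacitance

def kick_at_bnc(v_prev, v_this):
    """Instantaneous step seen at the BNC from charge sharing."""
    return (v_prev - v_this) * C_ON / (C_ON + C_CABLE)

# Scanning from a +10 V channel to a 0 V channel kicks ~5 V into the cable.
print(f"{kick_at_bnc(10.0, 0.0):.2f} V")
```

    So with these assumed values, half of the previous channel's voltage momentarily appears at the next channel's BNC, which is exactly the behavior described earlier in the thread.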

    I would be very disinclined (based on my philosophy) to make a board for
    market like this, without input buffers. I know it would add cost, but
    16-32 extra op amps on a $2000 board would be a fair price to pay for
    the benefit of being able to advertise the product as such, and to write
    a white paper about how your signals won't be read correctly when you
    connect a competitor's board to real-world sources without spending
    several $1000s in time/materials to have an electronics tech or
    engineer make you an analog buffer amp for each channel.

    I have at least one board which is built more like I would expect, a
    16-channel simultaneous sampling DAQ from Innovative Integrations (a DSP
    solutions vendor). That one has almost exactly the same input
    conditioning on each channel as I am going to be sticking on my DT3010s.

    Oh well,

    Good day!
     
  5. Certainly a fair design criticism. Whether it makes sense for them
    depends on whether the extra cost would price them out of their target
    market, which may not match your use, of course. I know I've been
    frustrated a time or two by the thought "If only they had added/changed
    this one simple thing I could just use X rather than having to ...."

    I've a bit of a soft spot for Data Translation. I did a project with
    one of their boards as a grad student and unlike some other boards I saw
    at the time they had enough information to program them w/o needing pre-
    packaged software.

    I just realized that's 20 years ago. Time flies.
    Hey at least you had the foresight to add room for signal conditioning.
    That gives you bonus points as far as I'm concerned.

    Robert
     
  6. Hi Chris,
    Welcome to the real world. A decade ago, I was on a 'big' project that
    was broken into a Hardware side, and a Software Side. On the hardware
    side there were four of us: our boss, who knew project management (and
    management in general) better than anyone else I have ever seen; an
    engineer who was officially labeled as Documentation; another engineer
    who was labeled as QA; and me, labeled as Senior Hardware Engineer. The
    Software side had at least 15 programmers and system administrators, of
    which 8 of them came and went during the time I was there, not to
    mention the 8 documentation clerks.

    During the nine months I was on the project (it was a virtual company -
    only the managers and about three software guys were actual employees,
    the rest of us were contractors) the hardware group was always on time,
    never had any problems (at least that upper management ever saw) and we
    were on budget. The software group was always 30 to 60 days behind
    schedule, was way over budget, and complained at every meeting to
    management about their troubles.

    I left because they 'fired' the entire hardware group, and spent the
    next year 'persuading' the hardware manager to quit. (They finally had
    to fire him, big severance and all...) They felt that he must have been
    padding his budget and schedule too much since he never had any problems!

    They promoted the software manager...

    Charlie
     
  7. Jim Thompson

    Jim Thompson Guest

    On Mon, 01 Oct 2007 14:51:40 -0700, Charlie Edmondson

    [snip]
    It's the American way ;-)

    I suspect I'm about to be shown the door, I ask too many embarrassing
    questions, and my solutions are so simple they can't understand how
    they work.

    Oh, well. With the Democrats coming the prospective clients are lined
    up from the front door to half-way down the block ;-)

    ...Jim Thompson
     
  8. John Larkin

    John Larkin Guest

    It's a real talent to be able to reduce a problem to its basic,
    simplest core. In digital design, that means recognizing the minimum
    number of stages and states necessary; in analog design, it's fuzzier
    but still real. I've seen serious industrial and aerospace electronics
    that was 5x or even 10x more complex than necessary. We recently
    replaced a GEC heads-up display driver, for the AC130, with less than
    a tenth of the original hardware, which incidentally had an MTBF of 22
    hours.

    Sometimes people get mad, or refuse to believe, when you show them a
    simple solution.

    John
     
  9. Phil Hobbs

    Phil Hobbs Guest

    Or even one that's too cheap. Figure that out.
    Those are two classical forms of 'thinking inside the box'.

    Cheers,

    Phil Hobbs
     
  10. John Larkin

    John Larkin Guest

    Some peoples' egos are stroked by buying expensive parts!
    Of course, you can overdo (underdo?) it, especially in analog design.
    Say, when two resistors control three important parameters, and you're
    boxed into a bad solution.

    John
     
  11. Phil Hobbs

    Phil Hobbs Guest

    Nah, then you just redesign it so that Herr Ohm or Mr. Faraday or Drs
    Eber and Moll look after the third parameter for you. ;)

    I'm not really advocating going back to 5-tube radios, but I've been
    scarred by my Footprints experience...it's really really hard to get
    people to let you save them money sometimes--even big money.

    Cheers,

    Phil Hobbs
     
  12. Jim Thompson

    Jim Thompson Guest

    :-(

    ...Jim Thompson
     
  13. John Larkin

    John Larkin Guest

    I designed a TTL-based PWM DAC once, that looked fine on paper. It had
    two trimpots, to set gain and offset. Once we built a bunch, we found
    that *nobody*, YHS included, could set the pots. The interaction was
    so bad that the adjustments would rapidly diverge. We had to work out
    a written algorithm that had to be rigorously followed. After a modest
    redesign, the next rev had nearly orthogonal adjustments.

    John
     
  14. You can expect people to be wary as the result of experience. While
    more expensive doesn't necessarily mean better, cheap often means
    poorly built. If the difference from previous expectations is large, the
    obvious question is: what has been left out/missed?

    Of course, when the design truly is that much better and cheaper, it can
    face a bit of a credibility hurdle.

    Robert
     
  15. Phil Hobbs

    Phil Hobbs Guest

    Been there. I had a *beautiful* measuring interferometer with badly
    interacting controls like that--it worked great but I had to rip the
    covers off it every time I wanted to use it, so I could put a white card
    in the beam inside the box.

    My usual rule of thumb is that 1/10 turn nonorthogonality per turn of
    the other control is OK, 3/10 requires an expert, and any more than that
    is next to impossible. (And this applies in both directions, so putting
    one adjustment on a fine thread doesn't help.)
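    The rule of thumb above can be illustrated with a toy linear model of two interacting adjustments; the coupling factor and the "alternately null each error" procedure are my assumptions for the sketch, not anything from the posts.

```python
# Toy model of two interacting adjustments: turning either control nulls
# its own parameter exactly but disturbs the other parameter by a
# fraction k (the "1/10 turn per turn" coupling in the rule of thumb).
def turns_to_converge(k, tol=0.01, max_iter=1000):
    """Alternate exact nulls of each axis; count adjustments until both
    errors fall below tol (both errors start at 1.0)."""
    e1, e2 = 1.0, 1.0
    for n in range(1, max_iter + 1):
        if n % 2:
            e2 += k * e1   # nulling control 1 disturbs parameter 2
            e1 = 0.0
        else:
            e1 += k * e2   # nulling control 2 disturbs parameter 1
            e2 = 0.0
        if abs(e1) < tol and abs(e2) < tol:
            return n
    return None  # effectively never converges

print(turns_to_converge(0.1))  # mild coupling: a handful of tweaks
print(turns_to_converge(0.3))  # needs patience
print(turns_to_converge(0.9))  # dozens of adjustments: "next to impossible"
```

    The residual error shrinks by a factor of k per adjustment, so 0.1 coupling converges in a few tweaks while anything approaching 1 takes so many alternations that it feels divergent at the bench.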

    Cheers,

    Phil Hobbs
     
  16. Phil Hobbs

    Phil Hobbs Guest

    Understood. What actually happened to me was that I had this 96-pixel
    thermal IR camera with very competitive sensitivity (0.13 K NETD), that
    cost roughly $50 in parts in single quantity--mostly for a
    well-upholstered PIC, some flash memory, and Maxim RS-232 ICs, because
    it was a development version designed by a physicist. A very mildly
    reengineered version would have been $50 total in quantity, and the
    sensor part itself was less than $10.

    The nearest equivalent used a germanium lens and a PZT pyroelectric
    array, and cost $4000 for poorer performance, albeit with 256 pixels.

    In licensing conversations, we asked for $25 per sensor in royalties.
    You'd think that the business folks would be thinking, "Hmm. So my cost
    would go down from $4k * 2000 sensors per year to $75 * 2000 sensors
    per year, which is $150k vs $8M. I can move to Aruba!"
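    The arithmetic in that imagined pitch checks out; spelled out with the numbers from the post ($50 parts plus the $25 royalty, against the $4000 competitor, at 2000 sensors per year):

```python
# Cost comparison from the licensing story (all figures from the post).
SENSORS_PER_YEAR = 2000
COMPETITOR_COST = 4000          # $ per sensor for the nearest equivalent
OUR_COST = 50 + 25              # $ parts plus the asked $25 royalty

old_spend = COMPETITOR_COST * SENSORS_PER_YEAR
new_spend = OUR_COST * SENSORS_PER_YEAR
print(f"${old_spend:,} vs ${new_spend:,}")
```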

    What we actually got was "That's a *FIFTY PERCENT ROYALTY!* Are you NUTS?"

    Wasn't just once, either. It would have been a much more attractive
    deal to them if it had cost $1000 per sensor and not $50--but we
    couldn't figure out how to make it that expensive.

    Cheers,

    Phil Hobbs
     
  17. Fred Bartoli

    Fred Bartoli Guest

    Le Wed, 03 Oct 2007 02:44:53 -0400, Phil Hobbs a écrit:
    Then propose to build it yourself and sell it for $1000. When they say
    "hey, but it costs next to nothing to you", propose to settle for a $250
    royalty and underline that it's a huge 75% saving for them :)
     
  18. John Larkin

    John Larkin Guest

    Deja vu. I designed a series of electrical measurement/datalogger
    boxes, for utility end-use surveys (which was a big business once) and
    licensed it to an outfit in New Orleans. They sold for 20x what they
    cost to make, and they made tons of bucks, but my 15% royalty still
    grated on them. So instead of working on new products with me, they
    hired an attorney who hired a California design team to do a legally
    defensible "clean room" redesign of the same functionality. They spent
    over a megabuck and never finished the project, the end-use market
    dried up, they had no other products in the pipeline, and the company
    died. All so they could show an "arrogant smart-aleck Californian"
    that they were just as smart as him.

    Secrecy, jealousy, NIH, and narrowband greed kill more companies than
    trade secret theft or honest competition. Smart people make alliances
    so everybody can do what they do best.

    Hey, the world still needs an affordable thermal imager. We have a
    Flir and it's hard to believe we ever survived without it. But it cost
    $10K. If they cost $250 or so, you could sell them at Home Depot.

    John
     