# Data Acquisition Accuracy & Clock Jitter

Discussion in 'Electronic Design' started by RobertMacy, Sep 16, 2013.

1. ### RobertMacy (Guest)

I'm having a bit of difficulty sorting out the effect on signals when there is
a bit of clock jitter at the sampling point.

It appears that Clock Jitter DESTROYS the signal!

And it is NOT a simple relationship. For example, low-speed signals are
barely affected, but high-speed signals suffer badly...

And when both are present it's another matter again. Which suggests that the
effect of clock jitter depends on the time-domain signal at the instant the
ADC puts its 'stamp' on it.

So how does one determine the clock jitter allowable?

And, what is the minimum clock jitter I could expect in a well-designed
200MS/s system? Is there some way to get better?

once the signal is sampled all you have is amplitude, so a timing
error translates to an amplitude error

it is basically ~
jitter (s) * slew rate (V/s) = error (V)

I'm sure google can find a more formal calculation of what you need
for a given requirement

-Lasse
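Lasse's rule of thumb can be sketched numerically. The 500 fs and 100 MHz figures below are illustrative only, and the worst-case slew rate of a sine is the standard 2*pi*f*A:

```python
import math

# Rule of thumb from above: error (V) ~= jitter (s) * slew rate (V/s).
# The worst-case slew rate of a sine of amplitude A and frequency f
# is 2*pi*f*A, occurring at the zero crossings.
def jitter_error_volts(jitter_s, freq_hz, amplitude_v):
    peak_slew = 2 * math.pi * freq_hz * amplitude_v  # V/s at the zero crossing
    return jitter_s * peak_slew

# e.g. 500 fs rms jitter sampling a 100 MHz, 1 V-amplitude sine:
err = jitter_error_volts(500e-15, 100e6, 1.0)
print(f"worst-case amplitude error ~ {err * 1e6:.0f} uV")
```

About 314 uV of error on a 1 V signal, which is already well above the LSB of a high-resolution converter.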

3. ### Guest

Sample clock jitter has three factors to consider: the number of digitizing
levels, the frequency of the highest acquired signal, and the sample clock rate.

Work out how much timing error it takes to create 1/2 LSB of error.

gl
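That half-LSB criterion can be sketched as a quick budget calculation, under the usual assumption of a full-scale sine at the highest input frequency:

```python
import math

# Jitter budget for 1/2 LSB of error, assuming a full-scale sine at f_max.
# Peak slew of a full-scale sine spanning range FS is pi * f * FS, and
# 1/2 LSB = FS / 2**(n_bits + 1), so the allowable timing error is:
def max_jitter_half_lsb(n_bits, f_max_hz):
    return 1.0 / (2 ** (n_bits + 1) * math.pi * f_max_hz)

# e.g. 12 bits with a 100 MHz top input frequency:
t_j = max_jitter_half_lsb(12, 100e6)
print(f"allowable jitter ~ {t_j * 1e15:.0f} fs")
```

Note that FS cancels out: the budget depends only on the bit count and the highest signal frequency, not on the full-scale voltage.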

4. ### RobertMacy (Guest)

Does that imply that oversampling would improve things by the square root of
the number of 'over' samples, effectively improving the ADC system?
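That sqrt(N) improvement holds if the jitter-induced errors are uncorrelated and zero-mean from sample to sample; correlated (e.g. spurious) jitter does not average down this way. A small Monte Carlo sketch, with the Gaussian noise model being an assumption:

```python
import math
import random

# If each sample's jitter-induced error is independent, zero-mean noise,
# averaging N samples should reduce its rms by ~sqrt(N).
random.seed(0)

def rms_after_averaging(n_avg, trials=20000, sigma=1.0):
    total = 0.0
    for _ in range(trials):
        avg = sum(random.gauss(0.0, sigma) for _ in range(n_avg)) / n_avg
        total += avg * avg
    return math.sqrt(total / trials)

r1, r16 = rms_after_averaging(1), rms_after_averaging(16)
print(f"rms ratio for 16x averaging ~ {r1 / r16:.2f}  (sqrt(16) = 4)")
```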

5. ### Jeroen Belleman (Guest)

It's easy enough: The error in a sample is proportional to
the product of the timing error and the rate of change of
the input signal.

The allowable jitter depends on the resolution of your
ADC. To give some idea, to digitize a 100MHz sine to
12 effective bits, you need clock jitter to be in the
fractional ps ballpark. Your results suggest that your
clock jitter is very bad indeed.
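The "fractional ps" figure follows from the standard jitter-limited-SNR relation, SNR = -20*log10(2*pi*f*t_j); a sketch of the arithmetic:

```python
import math

# Jitter-limited SNR for a full-scale sine at f_in: SNR = -20*log10(2*pi*f*t_j).
# Inverting it gives the rms jitter needed for a target effective-bit count.
def required_jitter_s(f_in_hz, effective_bits):
    snr_db = 6.02 * effective_bits + 1.76  # ideal quantization SNR, dB
    return 10 ** (-snr_db / 20) / (2 * math.pi * f_in_hz)

# 12 effective bits on a 100 MHz sine:
t_j = required_jitter_s(100e6, 12)
print(f"required rms clock jitter ~ {t_j * 1e15:.0f} fs")  # fractional-ps range
```

This comes out around 0.3 ps, consistent with the half-LSB budget above.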

You get good jitter specs by choosing your sampling clock
source wisely: quiet, stable quartz oscillators, good layout
and decoupling. *Don't* put your clock through FPGAs or
other shared logic. Treat your clock as if it were a sensitive
analog signal.

Jeroen Belleman

6. ### Guest

This application note is apposite:
http://www.linear.com/docs/25374

7. ### RobertMacy (Guest)

Thank you for Linear's app note. The app note has curves starting at 200 fs
rms; I've been trying to live with 500 fs and can almost make the system
work. The app note then goes on to mention the LTC2209, which has 70 fs rms
aperture jitter!! That'll do it. Less than 100 fs rms aperture jitter is
possible. Since my application already reduces the jitter by 10:1 [I
think], that makes the 'effective' jitter 10 fs, and that almost works!
Thanks.

8. ### RobertMacy (Guest)

Thank you for the URL; I'd forgotten about Silabs as a source. I can live with
more than 10% clock jitter everywhere BUT the data-acquisition aperture.
'Rattling' around there wreaks havoc on the quality of the digitized
input signal.

9. ### Guest

Well, a "consultant" is paid for what he knows, not what (or who) he
does.