Maker Pro

DAC and ADC Simultaneous Testing

vinit2100

Joined
Oct 21, 2013
Messages
100
I have a DAC and an ADC on a controller board. The DAC generates a 20mV pp, 2.5kHz sine wave into a DUT (Device Under Test). After processing the signal, the DUT produces an amplified version of the input. The output signal is fed to the ADC, which reads 100 samples of the analog output. The problem is that I can't operate both the DAC and the ADC simultaneously. I would appreciate help with this.
 

Attachments

  • DAC_ADC.PNG (12 KB)

Harald Kapp

Moderator
Joined
Nov 17, 2011
Messages
13,700
You should expand on "i cant function both DAC and ADC simultaneously":

- How are you controlling the ADC and the DAC?
- What components do you use (type of DAC and ADC, controller)?
- What exactly does "simultaneously" mean? Some jitter or time offset has to be allowed for. What is your max. tolerance for the timing difference between DAC and ADC?

A more detailed circuit diagram could also be helpful.
 

vinit2100

Joined
Oct 21, 2013
Messages
100
I have used the DAC to generate a sine wave of 100mV amplitude, feeding the DAC input from a look-up table. (I have written a program for that which generates the sine wave for, say, 5 seconds.) The sine wave so generated is fed to the DUT, which gives an analog output. This analog signal is read by the ADC, and I want to get the digital values. (I have written a program for reading the ADC and tested it successfully.) So, in short, I have a C function that generates a sine wave using the DAC, and at the same time I want to get the digital values using the ADC.
 

Harald Kapp

Moderator
Joined
Nov 17, 2011
Messages
13,700
You're not answering the questions very clearly. However, I assume you're using a microcontroller to output data to the DAC and to read from the ADC. This microcontroller cannot write and read data at the same time: physically, because the bus can be either in write or in read mode, not both at once; logically, because the program runs sequentially within the microcontroller.
So you will always have a delay between writing to the DAC and reading from the ADC.

If you really need 0 delay (exactly simultaneous read and write), then you will need additional hardware:
  1. 1 register to write the new DAC value into. This register will store the new value, but not latch it to its output until triggered by a separate signal.
  2. 1 register to read ADC data from
  3. 1 control signal that activates the output of the write register and the input of the read register at the same time
Of course you will need other types of DAC and ADC, too, since the ones you have chosen are serial and cannot be operated in this way.

To be able to judge your requirement for "simultaneous" output and input, we'd need to know more about the DUT. What is the reason for this requirement? What happens if you have a constant delay between writing to the DAC and reading from the ADC?
 

BobK

Joined
Jan 5, 2010
Messages
7,682
I think you are probably asking a simpler question than Harald is answering.

Do you mean that you don't know how to program the writing to the DAC and reading from the ADC as opposed to only being able to do one of those two things? If so, here is how.

Assuming that the ADC can operate at least as fast as you are outputting to the DAC.

This is how you might do it with a program loop:

loop:
send out DAC data
start ADC read
delay
fetch ADC result
goto loop

But far better is to make everything run off a timer interrupt.

The interrupt handler would look like this:

handle_int:
fetch ADC result
send out next DAC data
start ADC read
return

You would have to start the first ADC read before the timer interrupt was started and discard this first sample.

Bob
 

Harald Kapp

Moderator
Joined
Nov 17, 2011
Messages
13,700
Bob, is this my "complicated German way of thinking" again?

With respect to your proposal: The ADC being used can send an interrupt to the MCU when the conversion is finished (SSTRB pin). The interrupt would have to be programmed for detecting the rising edge of SSTRB. If not possible, use an external inverter to generate a falling edge instead.
 

BobK

Joined
Jan 5, 2010
Messages
7,682
Harald, it is your precise German interpretation of simultaneous. I am of half German descent, so I know. :)

Bob
 

vinit2100

Joined
Oct 21, 2013
Messages
100
Sir, actually I am using a DAC to generate a sine wave. I have a look-up table of around 300 values to feed the DAC; at present it generates a 2.5Khz sine wave, which is the requirement. And my program is similar to what you have written:
* loop:
send out DAC data
start ADC read
delay
fetch ADC result
goto loop *

This works for a single DAC input. But I am inputting 300 values to the DAC to create the sine wave. So after the ADC read I fetch the ADC result and then send out DAC data again, so there is a delay between "fetch ADC result" and "send out DAC data". This delay causes the output frequency to change. So, in short, I want '0' delay between these two. How can I achieve this?
 

vinit2100

Joined
Oct 21, 2013
Messages
100
The DAC and ADC work over an SPI interface. The DAC (AD5621) can operate at up to 30Mhz and the ADC (MAX1204) at up to 2Mhz. At present the DAC is clocked at 12Mhz and the ADC at 2Mhz. I am using an ARM9 processor running at 36Mhz, which controls both the DAC and the ADC.
 

Harald Kapp

Moderator
Joined
Nov 17, 2011
Messages
13,700
0 delay is not achievable. Your program runs sequentially, so there will always be a delay between the instructions for sending and receiving data.
You should not rely on the runtime of instructions to define the timing of your project.
What you need is a stable timebase that triggers the output of new DAC values at regular intervals. You can use a timer in the µC to do this, in a fashion similar to what this pseudo code shows:

Code:
main()
{
set timer to 1µs interval // output one sample every µs, giving a 300µs signal period; change the interval for other frequencies
while (1);
}

timer_interrupt()
{
output next DAC value;
read next ADC value;
}
 

vinit2100

Joined
Oct 21, 2013
Messages
100
You said to output a sample every 1µs. So the function timer_interrupt() must take less than 1µs?
 

vinit2100

Joined
Oct 21, 2013
Messages
100
So which device should run faster, the ADC or the DAC? The MAX1204 ADC conversion time is at most 5.5uS. Apart from the conversion, the ADC result is stored in a memory location, so reading the ADC takes more time.
 

Harald Kapp

Moderator
Joined
Nov 17, 2011
Messages
13,700
So the function timer interrupt() takes less than 1uS ?
That depends on your code, the speed of the µC etc. 1µs is just an example.
You have 2.5kHz and 300 samples/cycle.
One full cycle takes t=1/2.5kHz=400µs
400µs/300samples=1.333µs/sample. This is the actual timeframe for the interrupt.

ADC MAX 1204 conversion time is maximum 5.5uS
Actually, no. 5.5µs is the minimum conversion time; the maximum can be up to 10µs (datasheet page 3).

From these numbers you can see the problem: the ADC takes longer to digitize one sample (5.5µs) than the DAC's update interval (1.33µs). What you can do is read the ADC only on every 5th interrupt, or use a separate interrupt to read at the ADC's maximum rate. A 5.5µs sampling time is equivalent to a sampling rate of about 181.8 kilosamples per second.
 

vinit2100

Joined
Oct 21, 2013
Messages
100
If I change the update rate of the DAC to, say, 10uS, where should I make the change? In "the speed of the processor"?
 

Harald Kapp

Moderator
Joined
Nov 17, 2011
Messages
13,700
The update rate is defined (according to my code snippet) by the settings of the interrupt timer.
If you change the update rate of the DAC, you also change the frequency of the output signal.

Why do you need the same update rate for the DAC and the ADC? A 2.5kHz sine signal needs to be sampled at a rate of at least 5kHz (see the Nyquist theorem). Assuming a safety factor of 10, your sample rate would be 50kHz. Of course, proper anti-aliasing needs to be provided.

The 1.333µs/sample output rate of the DAC is equivalent to a 750kHz sampling rate. Compare this to the 50kHz sampling rate for the ADC and you'll see that you need one ADC input sample for every 15 DAC output samples.

Of course you will have to trim these numbers to the exact values in your application.
 

BobK

Joined
Jan 5, 2010
Messages
7,682
Sir, actually I am using a DAC to generate a sine wave. I have a look-up table of around 300 values to feed the DAC; at present it generates a 2.5Khz sine wave, which is the requirement. And my program is similar to what you have written:
* loop:
send out DAC data
start ADC read
delay
fetch ADC result
goto loop *

This works for a single DAC input. But I am inputting 300 values to the DAC to create the sine wave. So after the ADC read I fetch the ADC result and then send out DAC data again, so there is a delay between "fetch ADC result" and "send out DAC data". This delay causes the output frequency to change. So, in short, I want '0' delay between these two. How can I achieve this?
It sounds like you are saying the line called "send out DAC data" is sending all 300 values in sequence. Is that the case?

Bob
 

vinit2100

Joined
Oct 21, 2013
Messages
100
Hi, I am using the AD5621 DAC, and I want to generate a sine wave using this DAC. Please help me with this? :'(
 