For my internship, I am interested in the maximum streaming rate when using a Delta-Sigma ADC to read an input and output it on a VDAC. The actual signal used by the control algorithm will also have to be filtered, which may introduce more delay than I am already seeing. Moreover, the signal will be around 80 kHz.
During testing, I saw that a 50 kHz sinusoidal signal already shows quite some phase delay with respect to the original signal when it is output through the VDAC. Is this just the limit of the PSoC 5 I am using, or am I missing something that might give me better performance? For reference on what I did, see the figures below (yellow is the signal generator, blue is the VDAC output). The project is also attached.
According to the datasheet, in 16-bit continuous sampling mode the maximum sample rate is 48 ksps, but the input bandwidth is only about 11 kHz (roughly 1/4 of the sample rate). I suspect that is what you are observing. Switch to 8-bit mode and use a 384 ksps sample rate.
Also try turning off the ADC's input buffer to see whether that affects the delay.