Dear community,
As part of my project, I have to sample a signal (let's call it Vc) at every crossing of another signal (let's call it i_L) with a reference. The i_L signal, which triggers the SOC (start of conversion) of the ADC, is a sinusoid with a 1 V offset and 500 mV amplitude. The frequency of i_L is not a defining factor, and it leads Vc in phase by 90 degrees.
To see if there is a difference when sampling with the DelSig or the SAR, I have made the following design (do not pay attention to the interrupts, as they are for another purpose):
However, the result I achieve with the SAR is as I intended it, which is seen in:
If I then create a similar project, but with a DelSig ADC the output is not what I expect:
Therefore, I am curious as to why the DelSig does not perform the same as the SAR. My guess is that the DelSig in single-sample mode receives consecutive SOC requests at a lower sample rate than the SAR, which leads to this output. However, that does not explain the low voltage level on the output pin, which somewhat discredits my initial guess. Does anyone have a clue how the DelSig could fail to give an output similar to the SAR's? The project has been attached to this post.
Many thanks for the help!
Jim
- Labels: PSoC 5LP
I found the cause of my issue, though how it produced the sampling problems is still unknown. Once I changed the VDAC output to the 4 V range and the clock to 6 MHz, I was able to measure the Vc signal based on the comparator operation on the i_L signal. So, somehow the clock speed of the comparator and the edge detector determines the performance of the DelSig; for the SAR, the clock speed does not influence the performance. I will post some scope plots later.
Well first, between your two projects the VDACs are configured differently. This accounts for the difference in the amplitude of the signals.
If the issue is the sample rate, you could slow down i_L and both ADCs should behave the same. That said, I suspect something else is going on here.
So I suspect you're getting cross-sample contamination here. If you look, the actual samples per second are very similar between the two ADCs.
I think you need to set the DelSig to multi-sample mode and call StopConvert in an interrupt on its end of conversion.
I haven't tested this yet.
I will have access to my setup again tomorrow, so I will try some things then. About the frequency of the two signals: for now it is not a defining factor, but they need to have the same frequency, as my project requires this.
Regarding StopConvert, I would really like to keep the sampling in hardware. In the end, my processor only has to calculate some variables, and the control is executed in hardware. If I start adding things in software, I am afraid delays will prevent me from getting the most out of my project. Hence, do you think it is possible to perform a similar task in hardware?
I understand, I was just going by the documentation:
"Multi-sample mode captures single samples back to back, resetting itself and the modulator between each sample automatically. This mode is useful when the input is switched between multiple signals. The filters are flushed between each sample so previous samples do not affect the current conversion. Note: Take care when switching signals between ADC conversions. Either switch the input quickly between conversions with hardware control, or stop the ADC conversion (ADC_StopConvert()) while switching the input. Then restart the ADC conversion (ADC_StartConvert()) after the new signal has been connected to the ADC. Failure to do this may result in contamination between signals in the ADC results."
I guess since you're not really changing signals (just relying on the flushing behavior) you probably don't need to call StopConvert and StartConvert.
Oh wait, I forgot something: you'll need to call StopConvert() to prevent your DAC from being updated via DMA.