Hi,
I'm using the ADC_SAR component, but I can't find out how to get the time between samples. Can anyone help?
Labels: PSOC5 LP MCU
If you have a scope . . .
In the SAR configuration screen, enable the end-of-scan (EOS) output.
Route this signal to a pin.
The output is one clock wide; you can measure the time between pulses with your scope.
The fastest sample rate is the clock rate divided by 18.
A 12 MHz clock yields about 666,667 samples per second (1.5 µs between samples).
An 18 MHz clock yields 1,000,000 samples per second (1 µs between samples); this is the maximum rate.
Adjust the sample rate by adjusting the clock feeding the SAR.
---- Dennis
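Dennis's numbers can be checked with a quick calculation: each conversion takes 18 SAR clock cycles, so the sample rate is f_clk / 18 and the period between samples is 18 / f_clk. A minimal sketch in plain C (nothing here is PSoC-specific):

```c
#include <stdint.h>

/* Each ADC_SAR conversion takes 18 SAR clock cycles,
 * so rate = f_clk / 18 and period = 18 / f_clk. */
static double sar_sample_rate_sps(double clock_hz)
{
    return clock_hz / 18.0;
}

static double sar_sample_period_us(double clock_hz)
{
    return 18.0 / clock_hz * 1.0e6;
}
```

For example, a 12 MHz clock gives a period of 18 / 12 MHz = 1.5 µs between samples, matching the figure above.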
That should be easy to determine.
The two fields are in the RED boxes "Resolution (bits):" and "Conversion rate (SPS):"
Basically, the more bits you use, the slower the allowed conversion rate.
Once the resolution is selected, you can change the desired conversion rate. If a RED ! shows up next to the field, the value you selected is not valid; hovering over the RED ! shows the valid range of values.
Once you select the resolution and the desired conversion rate, the "Actual conversion rate (SPS):" in GREEN shows the actual rate. 1 / actual rate = time between conversions.
"Engineering is an Art. The Art of Compromise."
Hi,
But I can't find that variable in the code; I mean the green one.
The second thing I've noticed is that when I change the sampling rate, the total number of samples doesn't change. It stays the same.
For example, I set the conversion rate to the minimum, sent the data over USB to the PC, and plotted it on a chart. Then I changed the conversion rate to the maximum and repeated the same steps, and the result was the same chart. The signal should change when the conversion rate changes, yet no samples were missing.
Tranzystomator,
You may find a basic oscilloscope demo here:
Basic oscilloscope demo using ADC_SAR and KIT-059
The sampling rate can be changed by updating the external clock divider (Clock_SetDivider(new_value);) or, for a free-running ADC, by the method Len explained.
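As a sketch of what the `Clock_SetDivider(new_value)` call implies: the new divider for a target sample rate follows from the clock/18 relationship. The helper below is illustrative only; `Clock_1` is an assumed component name, and the master clock frequency routed to the SAR depends on your design:

```c
#include <stdint.h>

/* Compute the clock divider needed for a target sample rate,
 * assuming the SAR consumes 18 clocks per conversion.
 * Rounds to the nearest integer divider and clamps at 1. */
static uint16_t divider_for_rate(double master_hz, double target_sps)
{
    double div = master_hz / (18.0 * target_sps);
    if (div < 1.0) div = 1.0;
    return (uint16_t)(div + 0.5);
}

/* In PSoC Creator firmware this would be applied roughly as:
 *
 *   Clock_1_SetDivider(divider_for_rate(24.0e6, 50000.0));
 *
 * where Clock_1 is the clock component feeding the SAR
 * (an assumed name; use the one from your schematic).
 */
```

Because the divider is an integer, the achievable rate is quantized, which is why the GREEN "actual" rate in the configurator can differ from the requested one.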
But I asked how I can *get* this value in code, and you answered how I can *set* it. 😞
Tranzystomator,
Infineon could have supplied the conversion rate as a float value in a #define, but they chose not to.
In general, take the input clock frequency to the SAR and divide by 18; this appears to be the conversion rate calculation.
"Engineering is an Art. The Art of Compromise."
Tranzystomator:
There are (at least) two ways to set the ADC sample rate.
1. Pick a clock rate and adjust the sample delay to get the rate that you want.
2. Set up a timing circuit (typically using a PWM) and use the PWM output to trigger the external start-of-conversion (SOC) input. In this case, use a single-shot trigger.
I prefer the second method because it gives me absolute control of the sample rate without having to juggle internal sampling delays.
---- Dennis Seguine, PSoC Applications Engineer
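Dennis's second method can be sketched numerically. Assuming the convention that a PWM's output period is (period register + 1) clock cycles (verify against your PWM component's datasheet), the period value for a target SOC trigger rate would be:

```c
#include <stdint.h>

/* For a PWM triggering the SAR's SOC input, the sample rate is
 * pwm_clock / (period_register + 1), assuming the PWM counts
 * period+1 clocks per cycle (check your component's datasheet).
 * Compute the period register value for a target sample rate. */
static uint16_t pwm_period_for_rate(double pwm_clock_hz, double target_sps)
{
    double counts = pwm_clock_hz / target_sps;
    if (counts < 1.0) counts = 1.0;
    return (uint16_t)(counts + 0.5) - 1;  /* period register = counts - 1 */
}
```

For example, a 1 MHz PWM clock and a 1 kHz target rate give a period register of 999. The attraction of this approach, as Dennis notes, is that the sample rate is set directly by the trigger period rather than indirectly by the SAR clock.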
Hi,
Since we usually want to sample the ADC at uniform intervals,
setting the sample rate probably gives us the maximum rate allowed for that resolution.
I tried to measure the sampling time using a hardware timer, as shown below,
using my CY8CKIT-059.
(Screenshots attached: schematic, pin assignments, and Tera Term log.)
I'm not sure how accurate this measurement is,
but I hope it at least gives us some reference.
moto
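One way to post-process a hardware-timer measurement like moto's: if a 16-bit down-counting timer is captured on consecutive EOS pulses, the interval falls out of the difference between captures. This is a hedged sketch of the arithmetic only; the timer/capture wiring and the down-counting, 16-bit assumptions are mine, not taken from the attached schematic:

```c
#include <stdint.h>

/* Given two successive captures of a 16-bit down-counting timer,
 * taken on consecutive ADC EOS pulses, compute the elapsed ticks.
 * Unsigned 16-bit subtraction handles a single counter wrap. */
static uint16_t ticks_between_captures(uint16_t first, uint16_t second)
{
    return (uint16_t)(first - second);
}

/* Convert a tick count to microseconds for a given timer clock. */
static double interval_us(uint16_t ticks, double timer_clock_hz)
{
    return (double)ticks / timer_clock_hz * 1.0e6;
}
```

With a 24 MHz timer clock, 24 ticks between captures would correspond to 1 µs between samples.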