Hi,
I am using the TC377's DSADC to sample a resolver's sin/cos signals, with the CIC3 filter, FIR0, FIR1, offset compensation, and the integrator enabled.
The initial value of CALFACTOR is configured as 1.0985449.
1. If initial calibration after reset is not used, the sampling result is the same after every reset: Vin * 1.0985449.
2. If initial calibration after reset is used, the sampling result is different after every reset, and CALFACTOR is different, too.
For a given sine wave with Vpp = 3.27 V, if initial calibration after reset is not used, the sampled amplitude is 3.59 V, no matter how many times the device is reset.
For the same sine wave, if initial calibration after reset is used, the result is random: sometimes 3.05 V, sometimes 3.33 V.
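As a quick sanity check (a sketch based only on the numbers in this post, not on the TC377 manual): if the converted value is simply the input scaled by CALFACTOR, the stable no-calibration reading follows directly from the configured factor:

```python
# Sketch: check that the stable (no initial calibration) reading equals
# Vin * CALFACTOR. All values are taken from the post above.
CALFACTOR = 1.0985449   # configured correction-multiplier factor
vin_vpp = 3.27          # input sine-wave amplitude (Vpp)

expected = vin_vpp * CALFACTOR
print(f"expected reading: {expected:.3f} V")  # ~3.592 V, matching the observed 3.59 V
```

This matches the reported 3.59 V, so the no-calibration path behaves as a fixed gain; only the auto-calibrated path is unstable.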
Best regards
ASUS
Hi,
Can anyone help me? This problem has confused me for a long time.
Best Regards
ASUS
Hi,
I did some tests over the last few days.
1. The DSADC and the VADC sampled the same DC signal on the same pin at the same time, with the modulator at 20 MHz. The VADC result is constant, but the DSADC result is different after every reset.
The DSADC result divided by CALFACTOR is constant and close to the VADC result.
2. When I changed the modulator frequency from 20 MHz to 40 MHz, the problem disappeared.
| fMOD (MHz) | CIC Dec | CIC Shift | CorrMul Factor | FIR0 Dec | FIR1 Dec | Intg | Data Rate (Hz) | Grp Delay (us) |
|---|---|---|---|---|---|---|---|---|
| 20 | 8 | 9 | 1.0985449 | 2 | 2 | 64 | 9765.625 | 63.2 |
| 40 | 32 | 15 | 1.0985449 | 2 | 1 | 64 | 9765.625 | 76 |
I can't understand this: according to the datasheet, the modulator can run at any frequency from 16 MHz to 40 MHz, so why doesn't it work correctly at 20 MHz?
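Both configurations can be cross-checked against the table: assuming the output data rate is simply fMOD divided by the product of all decimation factors (CIC, FIR0, FIR1, integrator), which is the usual relation for a cascaded decimation chain, the two rows land on the same rate:

```python
# Sketch: output data rate = fMOD / (CIC_dec * FIR0_dec * FIR1_dec * Intg),
# assuming each stage decimates by its configured factor.
def data_rate(f_mod_hz, cic_dec, fir0_dec, fir1_dec, intg):
    return f_mod_hz / (cic_dec * fir0_dec * fir1_dec * intg)

print(data_rate(20e6, 8, 2, 2, 64))   # 9765.625 Hz (20 MHz row)
print(data_rate(40e6, 32, 2, 1, 64))  # 9765.625 Hz (40 MHz row)
```

So both setups produce results at the same 9765.625 Hz rate; only the modulator clock and the per-stage decimation differ.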
Hi ASUS,
Let me understand your question more clearly. How often do you read the DSADC result? Are you measuring a DC signal or the sin/cos signal? By interrupt or by polling? Regarding the differing results: does the value change with each reset, or does it keep changing after a single reset? Also, does this happen only at fMOD = 20 MHz? Try 16 MHz, and perhaps also different filter-chain parameters.
Thanks
Hi,
I'm having a similar issue. Did you ever find the root cause?
Hi FD,
Unfortunately, nobody answered me, and I still don't know the reason. I guess it may be related to the timing resolution: if the timing resolution is low, the module's results are less precise. That would explain why the results are accurate at the higher modulator frequency.
Best regards,
ASUS