Setting TIMER period

Anonymous

I am using a standard TIMER block to take a number of samples of an AC waveform at set time intervals. I'm basically doing an RMS voltage measurement, so I need to ensure I take 32 samples over one half cycle of the incoming AC signal, and the TIMER sets the sample period. Depending upon the frequency of the input AC signal (either 50Hz or 60Hz), I need to change the value of the Period buffer in my timer: a 50Hz cycle requires the samples to be taken slightly further apart than a 60Hz cycle does.
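
As a worked example: 32 samples per half cycle is 64 samples per full cycle, so the sample period needs to be 1/(50*64) ≈ 312.5usec at 50Hz and 1/(60*64) ≈ 260usec at 60Hz.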

In my code, at first power on, I calculate the frequency of the incoming AC signal, then I write an appropriate value (FreqVal) into the Period buffer of my timer using Timer_WritePeriod(FreqVal). This appears to work correctly.

Then, when I need to start my timer, I call Timer_Start(). That all seemed to work... or so I thought.

The issue, it seems, is that rather than using the value of FreqVal I wrote into the Period buffer earlier, the first call to Timer_Start() grabs the default Period value from the component customizer, effectively ignoring the earlier Timer_WritePeriod(FreqVal) call.

In reading the datasheet for this component, that is actually what it is supposed to do:

void TCPWM_Start(void)
Description: Initializes the TCPWM with default customizer values when called the first time and enables the TCPWM. For subsequent calls the configuration is left unchanged and the component is simply enabled.

But I don't want it to do that. I want it to use the FreqVal value I wrote in there earlier in my code each time I start it (which I do about every 10 seconds during operation). But because I start the Timer for the first time after I have written the updated value of FreqVal into the Timer Period, it just defaults to the preset value and my RMS calcs are all wrong!
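
To make the ordering concrete, the calls end up sequenced roughly like this (just a sketch; the frequency measurement itself is omitted):

    Timer_WritePeriod(FreqVal);  /* write the measured period value */
    /* ... */
    Timer_Start();               /* first-ever call runs the init step, which
                                    reloads the customizer default period and
                                    clobbers FreqVal */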

I did try changing all references to Timer_Start() to Timer_Enable(), as it appears this latter command doesn't load the default Period value from the customizer. However, my code started to behave a bit erratically - it basically timed out on the Watchdog timer, so I suspect my Timer was never actually triggering an interrupt. I didn't look into this much further.

The only way I have been able to get things to work correctly is to call Timer_Start() first, then immediately call Timer_Stop(), so that the initial set-up using the default value is completed. Then I call Timer_WritePeriod(FreqVal) to set the Timer Period, and all subsequent Timer_Start() calls run my timer with FreqVal as the period.
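
In code, the workaround looks roughly like this (a sketch of the sequence just described):

    Timer_Start();               /* first call: initialises with the default period */
    Timer_Stop();                /* stop straight away; initialisation is done */
    Timer_WritePeriod(FreqVal);  /* now write the measured period */
    /* ... later, roughly every 10 seconds ... */
    Timer_Start();               /* config is left unchanged, so FreqVal is used */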

This seems a bit messy to me, so I was wondering what the correct way to initialise a Timer is, so that whatever value I write into the Period buffer is the value it uses for its period.

Cheers,
Mike

5 Replies
Bob_Marlowe

You found a working solution which is quite often used.

With low frequencies such as 60Hz you may start the timer and then immediately change the period. The CPU is comparatively fast.
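
Something like this (a sketch using the API names from your post):

    Timer_Start();               /* the timer starts with the default period */
    Timer_WritePeriod(FreqVal);  /* overwrite the period immediately; at 60Hz the
                                    CPU gets here long before the first terminal count */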

Bob

Anonymous

OK, thanks Bob. Seems a weird way of going about it, that's all. Plus, I'm effectively taking 64 samples over a single cycle of the 60Hz AC waveform, which means my Timer period is 1/(60*64) ≈ 260usec. So I've really only got 260usec from the time I start the Timer to when the updated value of FreqVal needs to be written into the Timer buffer, and I'm not sure that is long enough.

Anonymous

Alternatively, you can set the period with Timer_WritePeriod(), then call Timer_Enable() to start the timer.

If you look at the underlying code for Timer_Start(), there are only one or two function calls, and Timer_Init() is the one that rewrites the period, so just skipping that one should work pretty well if you don't want the "messy code".
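
A sketch of that alternative, assuming the generated API names from the original post:

    Timer_WritePeriod(FreqVal);  /* set the measured period first */
    Timer_Enable();              /* enable only; Timer_Init() never runs, so the
                                    default period is not reloaded */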

Anonymous

Yeah, as per my original post, I tried using Timer_Enable(), but my code then locked up and invoked a WatchDog restart (presumably it got itself into an endless wait loop for some reason). I didn't want to spend hours trying to understand why, so I went back to using Timer_Start(), with the call to Timer_Start() then Timer_Stop() prior to calling Timer_WritePeriod(FreqVal).

Anonymous

Yeah; unfortunate really.

Bob's comment about changing the period might have been referring to the possibility of changing the period while the timer is running? Since the CPU is fast enough, the timer might only lose a couple of counts before the period is changed. But if it is working now, then there's no reason to break it.
