Reputation: 13
I am using the SysTick timer to create a delay; the SysTick handler fires once every microsecond (1 µs).
Additionally, I am using TIM1, whose handler fires once every second (1 s). Inside the TIM1 handler I toggle an LED.
In the while loop inside main I toggle another LED (a different one from the one in the TIM1 handler); the delay function here uses SysTick.
The TIM1 handler is executed as expected, but the problem is that the while loop in main is never executed.
Any help?
volatile uint32_t i = 0;

void TIM1_UP_TIM10_IRQHandler(void)
{
    NVIC_ClearPendingIRQ(TIM1_UP_TIM10_IRQn);
    i ^= 1;
    if (i == 1)
        LED_On(0);
    else
        LED_Off(0);
    TIM1->SR &= ~(1 << 0);  // Clear the update interrupt flag
}
int main(void)
{
    NVIC_SetPriority(TIM1_UP_TIM10_IRQn, 32);
    NVIC_EnableIRQ(TIM1_UP_TIM10_IRQn);
    LED_Initialize();

    RCC->APB2ENR |= (1 << 0);   // Enable Timer1 clock
    TIM1->CR1 &= ~(1 << 4);     // Set the direction of the timer to upcounter
    TIM1->DIER |= (1 << 0);     // Enable TIM1 update interrupt request
    TIM1->PSC = 15999;          // Each count takes 1 ms
    TIM1->ARR = 1000;           // An overflow update event occurs every 1 s
    TIM1->CR1 |= (1 << 0);      // Enable the counter

    SysTick_Init(16000000 / 1000000);   // SysTick interrupt every 16 clocks = 1 µs at 16 MHz

    while (1)
    {
        LED_On(1);
        Delay_MS(5000);
        LED_Off(1);
        Delay_MS(5000);
    }

    return 0;
}
Upvotes: 1
Views: 1377
Reputation: 93446
One microsecond is unreasonably fast for a SYSTICK interrupt, especially if running at just 16MHz. It is likely that your system is spending nearly 100% of its time in the interrupt handler counting ticks.
Even if it could sustain a 1MHz interrupt rate, if Delay_MS(5000) is a delay of 5000 SYSTICK periods, then you will be toggling the LED at 100Hz and will not perceive any flashing, just dimmer illumination.
It seems more likely that you intended one millisecond, which is a more reasonable tick interval:
SysTick_Init( 16000000 / 1000 ) ;
Although I would suggest in fact:
SysTick_Init( SystemCoreClock / 1000 ) ;
so that your code will adapt to changes in the clock rate - since 16MHz is a rather modest rate at which to run an STM32.
It is also possible in any event that your SYSTICK handler and Delay_MS() implementations are at fault, but it is not possible to guess without sight of that code. If both are provided library code, then that is less likely.
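For illustration only, a common shape for those two routines with a 1 ms tick is something like the sketch below; this is just an assumption about what your SysTick_Handler and Delay_MS might contain, since that code was not posted:

#include <stdint.h>

static volatile uint32_t msTicks = 0;   // incremented once per 1 ms SysTick interrupt

void SysTick_Handler(void)
{
    msTicks++;
}

void Delay_MS(uint32_t ms)
{
    uint32_t start = msTicks;
    while ((msTicks - start) < ms)      // unsigned subtraction copes with counter wrap
        ;
}

With a 1 ms tick, Delay_MS(5000) then gives the roughly 5 second on/off period your main loop appears to intend.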
Upvotes: 3
Reputation: 67476
If your clock is 16MHz and you want interrupts to happen every 16 clocks, it will not work. In the best-case scenario (code running from RAM etc.) you need at least 11 clocks to enter the interrupt handler and 6 clocks to exit it, which is more than the time between the interrupts.
Having interrupts every µs is a very bad idea even if you run faster. A 168MHz device will have only 168 clocks between the interrupts. Let's say your handler runs for 20 clocks, plus 11 + 6 to enter and exit = ~40 clocks. It means that about 25% of the processor time will be used just to increment a variable! Do not do it. On many other parts (max clock 72 or 80MHz) it will be even worse.
If you want a µs delay, do something like this (if the counter reloads during the wait you need to take that into consideration). The code is only to show the idea:
#define TICKSPER_uS 80

void delay_us(uint32_t uS)
{
    // SysTick->VAL counts DOWN, so the end value is below the start value;
    // a counter reload during the wait is not handled here.
    uint32_t endCnt = SysTick->VAL - uS * TICKSPER_uS;
    while (SysTick->VAL > endCnt)
        ;
}
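For example, it might be used like this (a sketch only; TICKSPER_uS has to match your real core clock, and delays longer than one SysTick reload period are not handled):

LED_On(1);
delay_us(10);    // busy-wait roughly 10 µs at 80 MHz (80 clocks per µs)
LED_Off(1);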
Upvotes: 3
Reputation: 125
This is my delay function for microseconds:
//==============================================================================
static uint32_t timeMicroSecDivider = 0;
extern uint32_t uwTick;
//==============================================================================
// The SysTick->LOAD matches the uC speed / 1000.
// If the uC clock is 80MHz, then the LOAD is 80000.
// SysTick->VAL is the down-counter, counting from (LOAD-1) to 0.
//==============================================================================
uint64_t getTimeMicroSec()
{
    if (timeMicroSecDivider == 0)
    {
        // Number of clocks per microsecond
        timeMicroSecDivider = SysTick->LOAD / 1000;
    }
    return ((uint64_t)uwTick * 1000) + ((SysTick->LOAD - SysTick->VAL) / timeMicroSecDivider);
}
//==============================================================================
void delayTimeMicroSec(uint32_t delay)
{
    uint64_t tickstart = getTimeMicroSec();
    while ((getTimeMicroSec() - tickstart) < delay)
        ;
}
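For example, it could be used like this (do_something() is just a hypothetical placeholder for the work being timed; uwTick is assumed to be the HAL millisecond counter updated by a 1 ms SysTick interrupt):

uint64_t t0 = getTimeMicroSec();
do_something();                                  // hypothetical work to be timed
uint64_t elapsed_us = getTimeMicroSec() - t0;    // elapsed time in µs

delayTimeMicroSec(500);                          // busy-wait roughly 500 µs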
Upvotes: -1