Forivin

Reputation: 15488

Running code every x seconds, no matter how long execution within loop takes

I'm trying to make an LED blink to the beat of a certain song. The song has exactly 125 bpm.
The code that I wrote seems to work at first, but the longer it runs, the larger the gap between the LED flash and the start of the next beat becomes. The LED seems to blink a tiny bit too slowly.

I think that happens because lastBlink depends on the blink that happened right before it to stay in sync, instead of syncing to one fixed initial value...

unsigned int bpm = 125;
int flashDuration = 10;
unsigned int lastBlink = 0;
for(;;) {
    if (getTickCount() >= lastBlink+1000/(bpm/60)) {
        lastBlink = getTickCount();
        printf("Blink!\r\n");
        RS232_SendByte(cport_nr, 4); //LED ON
        delay(flashDuration);
        RS232_SendByte(cport_nr, 0); //LED OFF
    }
}

Upvotes: 5

Views: 793

Answers (7)

Galik

Reputation: 48605

I think the drift problem may be rooted in using relative time delays, i.e. sleeping for a fixed duration rather than sleeping until an absolute point in time. Threads don't always wake up precisely on time because of scheduling, and with relative delays every late wake-up pushes all subsequent blinks back.

Something like this solution may work for you:

// for readability
using clock = std::chrono::steady_clock;

unsigned int bpm = 125;
int flashDuration = 10;

// time for entire cycle
clock::duration total_wait = std::chrono::milliseconds(1000 * 60 / bpm);

// time for LED off part of cycle
clock::duration off_wait = std::chrono::milliseconds(1000 - flashDuration);

// time for LED on part of cycle
clock::duration on_wait = total_wait - off_wait;

// when is next change ready?
clock::time_point ready = clock::now();

for(;;)
{
    // wait for time to turn light on
    std::this_thread::sleep_until(ready);

    RS232_SendByte(cport_nr, 4); // LED ON

    // reset timer for off
    ready += on_wait;

    // wait for time to turn light off
    std::this_thread::sleep_until(ready);

    RS232_SendByte(cport_nr, 0); // LED OFF

    // reset timer for on
    ready += off_wait;
}

Upvotes: 3

nissefors

Reputation: 440

If your problem is drifting out of sync rather than latency, I would suggest measuring time from a given start instead of from the last blink.

start = now()
blinks = 0
period = 60 / bpm
while true
    if 0 < ((now() - start) - blinks * period)
        ledon()
        sleep(blinklength)
        ledoff()
        blinks++
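
A minimal C++ sketch of that idea, as an illustration: it keeps the measure-everything-from-a-fixed-start structure of the pseudocode, but sleeps until the computed time instead of polling, and leaves the question's RS232 calls as comments.

#include <chrono>
#include <cstdio>
#include <thread>

int main()
{
    using namespace std::chrono;

    const unsigned bpm = 125;
    const auto period = duration<double>(60.0 / bpm);  // 0.48 s per beat
    const auto flash  = milliseconds(10);

    const auto start  = steady_clock::now();
    unsigned long blinks = 0;

    for (;;) {
        // every blink is scheduled relative to the fixed start, so errors never accumulate
        std::this_thread::sleep_until(start + blinks * period);
        std::printf("Blink!\n");
        //RS232_SendByte(cport_nr, 4); // LED ON
        std::this_thread::sleep_for(flash);
        //RS232_SendByte(cport_nr, 0); // LED OFF
        ++blinks;
    }
}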

Upvotes: 2

Howard Hinnant

Reputation: 218710

Since you didn't specify C++98/03, I'm assuming at least C++11, and thus <chrono> is available. This so far is consistent with Galik's answer. However I would set it up so as to use <chrono>'s conversion abilities more precisely, and without having to manually enter conversion factors, except to describe "beats / minute", or actually in this answer, the inverse: "minutes / beat".

#include <chrono>
#include <ratio>
#include <thread>

using namespace std;
using namespace std::chrono;
using mpb = duration<int, ratio_divide<minutes::period, ratio<125>>>;
constexpr auto flashDuration = 10ms;
auto beginBlink = steady_clock::now() + mpb{0};
while (true)
{
    RS232_SendByte(cport_nr, 4); //LED ON
    this_thread::sleep_until(beginBlink + flashDuration);
    RS232_SendByte(cport_nr, 0); //LED OFF
    beginBlink += mpb{1};
    this_thread::sleep_until(beginBlink);
}

The first thing to do is specify the duration of a beat, which is "minutes/125". This is what mpb does. I've used minutes::period as a stand in for 60, just in an attempt to improve readability and reduce the number of magic numbers.

Assuming C++14, I can give flashDuration real units (milliseconds). In C++11 this would need to be spelled with this more verbose syntax:

constexpr auto flashDuration = milliseconds{10};

And then the loop: This is very similar in design to Galik's answer, but here I only increment the time to start the blink once per iteration, and each time, by precisely 60/125 seconds.

By delaying until a specified time_point, as opposed to a specific duration, one ensures that there is no round off accumulation as time progresses. And by working in units which exactly describe your required duration interval, there is also no round off error in terms of computing the start time of the next interval.

No need to traffic in milliseconds. And no need to compute how long one needs to delay. Only the need to symbolically compute the start time of each iteration.
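
As an illustrative aside, <chrono> can confirm at compile time that one beat of mpb really is exactly 480ms:

#include <chrono>
#include <ratio>

using namespace std::chrono;
using mpb = duration<int, std::ratio_divide<minutes::period, std::ratio<125>>>;

// minutes::period / 125 is ratio<60,1> / ratio<125,1> == ratio<12,25>,
// i.e. one beat is 12/25 of a second, which converts to 480ms with no remainder
static_assert(duration_cast<milliseconds>(mpb{1}).count() == 480,
              "one beat at 125 bpm is exactly 480ms");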

Um...

Sorry to pick on Galik's answer, which I believe is the second best answer next to mine, but it exhibits a bug which my answer not only doesn't have, but is designed to prevent. I didn't notice it until I dug into it with a calculator, and it is subtle enough that testing might miss it.

In Galik's answer:

total_wait =  480ms;  // this is exactly correct
off_wait   =  990ms;  // likely a design flaw
on_wait    = -510ms;  // certainly a mistake

Because on_wait is negative, the second sleep_until returns immediately and the LED is switched off almost as soon as it is switched on. Meanwhile the loop still advances ready by on_wait + off_wait per iteration, which is exactly total_wait (480ms), so the beat timing looks right and the bug is very easy to miss, making debugging challenging.

In contrast, my answer increments ready (beginBlink) only once per iteration, and by exactly 480ms.

My answer is more likely to be right for the simple reason that it delegates more of its computation to the <chrono> library. And in this particular case, that probability paid off.

Avoid manual conversions. Instead let the <chrono> library do them for you. Manual conversions introduce the possibility for error.
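
As an illustration of that point, <chrono> performs exact conversions implicitly and refuses lossy ones unless they are spelled out:

#include <chrono>

int main()
{
    using namespace std::chrono;

    milliseconds a = seconds{2};                             // ok: exact, converted implicitly
    // seconds bad = milliseconds{2500};                     // won't compile: would truncate
    seconds b = duration_cast<seconds>(milliseconds{2500});  // truncation must be requested explicitly
    return (a.count() == 2000 && b.count() == 2) ? 0 : 1;
}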

Upvotes: 1

Jeremy Friesner

Reputation: 73051

Busy-waiting is bad: it spins the CPU for no good reason, and under most OSes it will lead to your process being punished -- the OS will notice that it is using lots of CPU time and dynamically lower its priority so that other, less greedy programs get first dibs on the CPU. It's much better to sleep until the appointed time(s) instead.

The trick is to dynamically calculate the amount of time to sleep until the next time to blink, based on the current system-clock time. (Simply delaying by a fixed amount of time means you will inevitably drift, since each iteration of your loop takes a non-zero and somewhat indeterminate time to execute).

Example code (tested under MacOS/X, probably also compiles under Linux, but can be adapted for just about any OS with some changes) follows:

#include <stdio.h>
#include <time.h>      // clock_gettime(), if USE_POSIX_MONOTONIC_CLOCK is defined
#include <unistd.h>
#include <sys/times.h>

// unit conversion code, just to make the conversion more obvious and self-documenting
static unsigned long long SecondsToMillis(unsigned long secs) {return secs*1000;}
static unsigned long long MillisToMicros(unsigned long ms)    {return ms*1000;}
static unsigned long long NanosToMillis(unsigned long nanos)  {return nanos/1000000;}

// Returns the current absolute time, in milliseconds, based on the appropriate high-resolution clock
static unsigned long long getCurrentTimeMillis()
{
#if defined(USE_POSIX_MONOTONIC_CLOCK)
   // Nicer New-style version using clock_gettime() and the monotonic clock
   struct timespec ts;
   return (clock_gettime(CLOCK_MONOTONIC, &ts) == 0) ? (SecondsToMillis(ts.tv_sec)+NanosToMillis(ts.tv_nsec)) : 0;
#  else
   // old-school POSIX version using times()
   static clock_t _ticksPerSecond = 0;
   if (_ticksPerSecond <= 0) _ticksPerSecond = sysconf(_SC_CLK_TCK);

   struct tms junk; clock_t newTicks = (clock_t) times(&junk);
   return (_ticksPerSecond > 0) ? (SecondsToMillis((unsigned long long)newTicks)/_ticksPerSecond) : 0;
#endif
}

int main(int, char **)
{
   const unsigned int bpm = 125;
   const unsigned int flashDurationMillis = 10;
   const unsigned int millisBetweenBlinks = SecondsToMillis(60)/bpm;
   printf("Milliseconds between blinks:  %u\n", millisBetweenBlinks);

   unsigned long long nextBlinkTimeMillis = getCurrentTimeMillis();
   for(;;) {
       long long millisToSleepFor = nextBlinkTimeMillis - getCurrentTimeMillis();
       if (millisToSleepFor > 0) usleep(MillisToMicros(millisToSleepFor));

       printf("Blink!\r\n");
       //RS232_SendByte(cport_nr, 4); //LED ON
       usleep(MillisToMicros(flashDurationMillis));
       //RS232_SendByte(cport_nr, 0); //LED OFF
       nextBlinkTimeMillis += millisBetweenBlinks;
   }
}

Upvotes: 3

Aslak Berby

Reputation: 183

Add the value to lastBlink rather than re-reading the tick count, because getTickCount() may have advanced past the exact beat you wanted to wait for, and re-reading it lets that error accumulate.

lastBlink += 1000 * 60 / bpm;  // 480 ms per beat
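
In the context of the question's loop, that change might look like the following sketch (assuming the same getTickCount(), delay() and RS232_SendByte() helpers the question uses):

unsigned int bpm = 125;
int flashDuration = 10;
unsigned int beatInterval = 1000u * 60u / bpm;  // 480 ms; dividing last avoids truncation
unsigned int lastBlink = getTickCount();        // sync once to a fixed starting point

for (;;) {
    if (getTickCount() >= lastBlink + beatInterval) {
        lastBlink += beatInterval;              // advance by the interval instead of re-reading the clock
        printf("Blink!\r\n");
        RS232_SendByte(cport_nr, 4); // LED ON
        delay(flashDuration);
        RS232_SendByte(cport_nr, 0); // LED OFF
    }
}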

Upvotes: 3

patros

Reputation: 7819

The most obvious issue is that you're losing precision when you divide bpm/60. In integer arithmetic 125/60 yields 2 instead of 2.08333..., so 1000/(bpm/60) works out to 500 ms per beat instead of the 480 ms you actually need, which is why the blinking slowly falls behind the song.

Calling getTickCount() a second time when resetting lastBlink could also lead to some drift, since any lateness in detecting the beat then gets carried into the next interval.
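
A quick check of the first point with the question's values:

#include <cstdio>

int main()
{
    unsigned int bpm = 125;
    std::printf("%u\n", 1000 / (bpm / 60)); // 500: bpm/60 truncates to 2, so each beat is 20 ms too long
    std::printf("%u\n", 1000 * 60 / bpm);   // 480: dividing last keeps the exact value
    return 0;
}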

Upvotes: 0

shakeit

Reputation: 1

You should measure the time the processing takes and subtract it from the flashDuration value.

Upvotes: 0
