Reputation: 19
#include "target.h"
#include "xcp.h"
#include "LocatedVars.h"
#include "osek.h"
/**
* This task is activated every 10ms.
*/
long OSTICKDURATION;
TASK( Task10ms )
{
void XCP_FN_TYPE Xcp_CmdProcessor( void );
uint32 startTime = GetQueryPerformanceCounter();
/* Trigger DAQ for the 10ms XCP raster. */
if( XCPEVENT_DAQ_OVERLOAD & Xcp_DoDaqForEvent_10msRstr() )
{
++numDaqOverload10ms;
}
/* Update those variables which are modified every 10ms. */
counter16 += slope16;
/* Trigger STIM for the 10ms XCP raster. */
if( enableBypass10ms )
{
if( XCPEVENT_MISSING_DTO & Xcp_DoStimForEvent_10msRstr() )
{
++numMissingDto10ms;
}
}
duration10ms = (uint32)( ( GetQueryPerformanceCounter() - startTime ) / STOPWATCH_TICKS_PER_US );
}
What would be the easiest (and/or best) way to synchronise to an accurate clock so that a function is called at a specific time interval, with little jitter under normal circumstances, from C++? I am working on the Windows operating system now. The code above is for RTAS OSEK, but I want to call a function at a specific time interval on Windows. Could anyone assist me in C++?
Upvotes: 2
Views: 2702
Reputation: 5194
New applications should use CreateTimerQueueTimer!
Timers in this queue, known as timer-queue timers, are lightweight objects that enable you to specify a callback function to be called when the specified due time arrives. The wait operation is performed by a thread in the thread pool.
See the CreateTimerQueueTimer function documentation and the example "Using Timer Queues (C++)".
However, the granularity is about 1 ms, so a 10 ms setup may cause periodic hiccups at 9/11 ms.
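Not part of the original answer, but a minimal sketch of a periodic 10 ms timer-queue timer might look like the following. The callback name, the printf body and the Sleep-based lifetime are illustrative only; the callback runs on a thread-pool thread, so it should return quickly.

```cpp
#include <windows.h>
#include <cstdio>

// Called by a thread-pool thread every time the timer fires.
VOID CALLBACK Task10msCallback(PVOID /*param*/, BOOLEAN /*timerOrWaitFired*/)
{
    // Do the 10 ms work here (keep it short; it runs on a pool thread).
    std::printf("10 ms tick\n");
}

int main()
{
    HANDLE timer = nullptr;
    // nullptr queue = default timer queue; due time 10 ms, period 10 ms.
    if (!CreateTimerQueueTimer(&timer, nullptr, Task10msCallback,
                               nullptr, 10, 10, WT_EXECUTEDEFAULT))
    {
        std::printf("CreateTimerQueueTimer failed: %lu\n", GetLastError());
        return 1;
    }

    Sleep(5000);  // let the timer run for a while

    // INVALID_HANDLE_VALUE makes the call wait for any in-flight callback.
    DeleteTimerQueueTimer(nullptr, timer, INVALID_HANDLE_VALUE);
    return 0;
}
```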
For higher resolution you may have to set up a timer wheel using Clock::now(), as described here.
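As a rough illustration of the Clock::now() idea (not taken from the linked answer), a fixed-interval loop can sleep until absolute deadlines so that jitter in one cycle does not accumulate into the next; the period and loop count below are placeholders. Note that on Windows the default scheduler granularity (typically around 15.6 ms) still limits how precisely sleep_until wakes up, so this is often combined with timeBeginPeriod(1).

```cpp
#include <chrono>
#include <thread>
#include <cstdio>

int main()
{
    using clock = std::chrono::steady_clock;
    constexpr auto period = std::chrono::milliseconds(10);

    auto next = clock::now() + period;
    for (int i = 0; i < 500; ++i)
    {
        std::this_thread::sleep_until(next);  // absolute deadline: drift does not accumulate
        next += period;

        // 10 ms work goes here.
        std::printf("tick %d\n", i);
    }
    return 0;
}
```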
Upvotes: 2
Reputation: 10415
The timeSetEvent API will give you the best stability available, and it can go as low as 1-millisecond intervals.
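The answer gives no code; a minimal sketch of a periodic timeSetEvent timer might look like this (the callback body, resolution hint and Sleep-based lifetime are illustrative; note that Microsoft documents this multimedia-timer API as obsolete in favour of timer queues, so treat it as legacy):

```cpp
#include <windows.h>
#include <mmsystem.h>
#include <cstdio>
#pragma comment(lib, "winmm.lib")  // MSVC: link against winmm.lib

// Runs on a system-owned thread every period; keep the work short.
void CALLBACK TimerProc(UINT /*id*/, UINT /*msg*/, DWORD_PTR /*user*/,
                        DWORD_PTR /*dw1*/, DWORD_PTR /*dw2*/)
{
    std::printf("10 ms tick\n");
}

int main()
{
    timeBeginPeriod(1);  // request 1 ms system timer resolution

    // 10 ms period, 1 ms resolution hint, periodic callback.
    MMRESULT id = timeSetEvent(10, 1, TimerProc, 0,
                               TIME_PERIODIC | TIME_CALLBACK_FUNCTION);
    if (id == 0)
    {
        std::printf("timeSetEvent failed\n");
        timeEndPeriod(1);
        return 1;
    }

    Sleep(5000);  // let the timer run for a while

    timeKillEvent(id);
    timeEndPeriod(1);
    return 0;
}
```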
Upvotes: 1