Reputation: 11
I'm writing a C++ application using asio for asynchronous networking and execution. In my application I want to be able to asynchronously call a function every 50 milliseconds with around one millisecond of error. I came up with this trivial example to illustrate what I am trying to do.
#include <asio.hpp>
#include <chrono>
#include <iostream>
#include <system_error>

void timer_callback(asio::high_resolution_timer& timer, const std::error_code& error_code, long long t)
{
    auto current = std::chrono::duration_cast<std::chrono::milliseconds>(
        std::chrono::high_resolution_clock::now().time_since_epoch()).count();
    auto diff = current - t;
    std::cout << diff << '\n';

    // Reschedule relative to the previous expiry so the 50 ms period does not drift.
    timer.expires_at(timer.expiry() + std::chrono::milliseconds(50));
    timer.async_wait([&timer, current](const std::error_code& error_code) { timer_callback(timer, error_code, current); });
}

int main()
{
    asio::io_context io_context;
    asio::high_resolution_timer timer(io_context);

    timer.expires_from_now(std::chrono::milliseconds(50));
    auto current = std::chrono::duration_cast<std::chrono::milliseconds>(
        std::chrono::high_resolution_clock::now().time_since_epoch()).count();
    timer.async_wait([&timer, current](const std::error_code& error_code) { timer_callback(timer, error_code, current); });

    io_context.run();
}
I tried running the code on Linux, and as expected the program printed
50
50
50
50
50
50
50
50
50
50
However, when I compiled with MSVC and ran the program on Windows, I got this output:
61
48
45
61
47
48
48
48
47
62
One would expect the Windows program to produce the same results as the Linux program, since I am using std::chrono::high_resolution_clock for all of my timings. If someone could help explain this discrepancy, I would appreciate it.
Upvotes: 1
Views: 350
Reputation: 952
Try calling timeBeginPeriod(1)
somewhere near the start of your program. That raises the Windows timer resolution to roughly 1 ms instead of the default ~15 ms, which is what produces the jitter you are seeing.
You can find some detailed discussion of the timer resolution at https://randomascii.wordpress.com/2020/10/04/windows-timer-resolution-the-great-rule-change/
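As a minimal sketch of how this could be wired in: timeBeginPeriod and timeEndPeriod are the actual winmm functions (link against winmm.lib), but the 1 ms period and the RAII wrapper with its name are just illustrative choices, and every timeBeginPeriod call should be matched by a timeEndPeriod with the same value.

#include <windows.h>
#include <timeapi.h>
#pragma comment(lib, "winmm.lib")

// Illustrative RAII wrapper: requests a finer system timer resolution for the
// lifetime of the object and restores the previous setting on destruction.
struct ScopedTimerResolution
{
    UINT period;
    explicit ScopedTimerResolution(UINT ms) : period(ms)
    {
        timeBeginPeriod(period);   // request ~1 ms scheduler granularity
    }
    ~ScopedTimerResolution()
    {
        timeEndPeriod(period);     // must match the earlier timeBeginPeriod
    }
};

int main()
{
    ScopedTimerResolution resolution(1); // near the start of the program
    // ... set up asio::io_context, the high_resolution_timer, and call run() ...
}

With that in place, asio's timer waits on Windows should fire much closer to the requested 50 ms than with the default resolution, at the cost of slightly higher system-wide power usage as discussed in the linked article.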
Upvotes: 3