d0d0

Reputation: 160

Lag between GetTickCount() and timeGetTime() changes between executions of my test program

I know that GetTickCount() and timeGetTime() have different resolutions and that the timer resolution of timeGetTime() can be set via calls to timeBeginPeriod().

My understanding is that increasing the timer's resolution via timeBeginPeriod() shortens the interval for which the system sleeps between successive increments of the counter behind timeGetTime().
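(Side note: as far as I know, the range of periods the multimedia timer actually supports can be queried with timeGetDevCaps() before calling timeBeginPeriod(). A minimal sketch of that check, separate from the test program below:)

#include <windows.h>
#include <iostream>

// timeGetDevCaps() lives in winmm.lib
#pragma comment(lib, "winmm.lib")

int main(void) {
    // query the period range supported by the multimedia timer
    TIMECAPS tc;
    if (timeGetDevCaps(&tc, sizeof(tc)) == TIMERR_NOERROR) {
        std::cout << "supported period range: " << tc.wPeriodMin
                  << "ms - " << tc.wPeriodMax << "ms" << std::endl;
    }
    return 0;
}

On most systems wPeriodMin reports 1ms, which matches the 1ms resolution I request in the test program below.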

Let's say the time-resolution of GetTickCount() is 16ms (its value is incremented by 16 every 16ms), and I have set the resolution of timeGetTime() to 1ms (its value is incremented by 1 every 1ms). My question is about the timepoint at which the tick-counter is updated.

I wrote a small test program to see what kind of lag the timer has behind the tick-counter at the moment the tick-counter is incremented. By lag I mean the difference GetTickCount() - timeGetTime() right when GetTickCount() updates. E.g. a lag of 0 would mean the tick-counter is updated from 16 to 32 at the moment timeGetTime() returns 32, while a lag of 4 means the tick-counter is incremented from 16 to 32 when timeGetTime() returns 28. Here's the code:

#include <windows.h>
#include <iostream>
#include <vector>

// timeBeginPeriod()/timeGetTime() live in winmm.lib
#pragma comment(lib, "winmm.lib")

int main(void) {

    // set time resolution to 1ms
    timeBeginPeriod(1);

    // measure the lag at 200 consecutive tick-counter increments
    std::vector<int> difftime(200);
    DWORD lasttick;

    for (int i = 0; i < 200; ++i) {
        // busy-wait until GetTickCount() moves to its next value
        lasttick = GetTickCount();
        while (!(GetTickCount() - lasttick)) ;
        // record the lag right after the tick-counter was incremented
        difftime[i] = static_cast<int>(GetTickCount() - timeGetTime());
    }

    // reset timer resolution
    timeEndPeriod(1);

    // printout
    std::cout << "timediff" << std::endl;
    for (int i = 0; i < 200; ++i) {
        std::cout << difftime[i] << std::endl;
    }

    return 0;
}

What surprised me was that while the lag between the two functions is constant during one run of my program, it varies widely between repeated executions of the program. I expected the counters behind those two functions to always run in the background, so I figured the lag should be constant between executions.

At first I thought that increasing the resolution of the timer behind timeGetTime() might cause this by introducing some random lag between the two, but even when I leave the resolution at 1ms between executions, the lag still varies from run to run.
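(To be concrete about what I mean by leaving the resolution at 1ms between executions: the 1ms period stays requested while the test program is not running. Just as an illustration, a minimal sketch of a hypothetical helper process that would hold the period until it is killed:)

// hypothetical helper: holds the 1ms timer period for as long as it runs
#include <windows.h>

#pragma comment(lib, "winmm.lib")

int main(void) {
    timeBeginPeriod(1);   // request 1ms resolution system-wide
    Sleep(INFINITE);      // keep the request alive until the process is killed
    timeEndPeriod(1);     // never reached, shown only for symmetry
    return 0;
}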

Does anybody know what mechanism causes this kind of behavior?

Upvotes: 1

Views: 750

Answers (0)
