Reputation: 580
I am trying to simulate a point falling at a constant rate in Linux. For this to work I need to get the time with millisecond resolution. That part is fine, but I am having a problem with clock_gettime.
When the 'tv_nsec' field reaches about 1000000000 it wraps back to near zero, and the time retrieved by clock_gettime is before the time retrieved in the previous iteration. Note that this doesn't happen every time the field wraps, but it does happen.
To debug, I made it print the values returned from clock_gettime and the delta calculated from them:
Iteration:
gettime.seconds: 1362720808 , gettime.us: 993649771, total: 1362721801649 us
delta: 0.014

Another iteration:
gettime.seconds: 1362720808 , gettime.us: 993667981, total: 1362721801667 us
delta: 0.015

Another iteration:
gettime.seconds: 1362720808 , gettime.us: 993686119, total: 1362721801686 us
delta: 0.015

Iteration in question:
gettime.seconds: 1362720809 , gettime.us: 20032630, total: 1362720829032 us
delta: -972.661
Note that the delta is in seconds: it is the millisecond difference divided by 1000. It comes out negative because the newly retrieved time is smaller than the previously retrieved one, even though it was taken later.
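For example, working it through with the totals above: (1362720829032 - 1362721801686) / 1000 ≈ -972.65, which roughly matches the delta shown.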
The code to reproduce the problem is here:
#include <iostream>
#include <ctime>        // needed for clock_gettime and timespec
#include <sys/time.h>

using namespace std;

double prevMillis = 0.0;

double getMillis()
{
    timespec ts;
    clock_gettime(CLOCK_REALTIME, &ts);

    cout << "gettime.seconds: " << ts.tv_sec << " , gettime.us: " << ts.tv_nsec << ", total: " << ((ts.tv_sec * 1000) + (ts.tv_nsec / 1000)) << " ms" << endl;

    return ((ts.tv_sec * 1000) + (ts.tv_nsec / 1000)) + 0.5;
}

int main()
{
    double delta = 0.0;
    prevMillis = getMillis();

    while(delta >= 0)
    {
        delta = (getMillis() - prevMillis) / 1000;
        prevMillis = getMillis();
        cout << "Delta: " << delta << endl << endl;
    }

    return 0;
}
Note that it must be compiled with '-lrt' for the clock functions.
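For reference, a compile command along these lines should work (the source file name here is just a placeholder):

g++ timer.cpp -o timer -lrt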
This will loop until the problem occurs, i.e. the delta is negative because of the time. It only takes a few seconds on my PC.
Sorry about the verbose question, and thanks in advance for any help :)
Upvotes: 1
Views: 3156
Reputation: 14622
tv_nsec is nanoseconds, i.e. one billionth (1/1,000,000,000) of a second. Your calculation, however, is treating it as if it were microseconds.
Here's the fix:
return ((ts.tv_sec * 1000) + (ts.tv_nsec / 1000000)) + 0.5;
^^^
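You can check this against the numbers in the question. With the wrong divisor, 1362720808 * 1000 + 993649771 / 1000 = 1362721801649, but just after the seconds roll over 1362720809 * 1000 + 20032630 / 1000 = 1362720829032, which is smaller, hence the negative delta. With the corrected divisor the two totals become 1362720808993 and 1362720809020, which increase as expected.

For completeness, here is a minimal sketch of the whole function with the fix applied; using floating-point literals to keep sub-millisecond precision is my own addition, not part of the original code:

double getMillis()
{
    timespec ts;
    clock_gettime(CLOCK_REALTIME, &ts);

    // tv_nsec is nanoseconds, so divide by 1,000,000 to get milliseconds.
    // CLOCK_REALTIME can still jump if the wall clock is adjusted.
    return (ts.tv_sec * 1000.0) + (ts.tv_nsec / 1000000.0);
}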
Upvotes: 4