Reputation: 2210
I needed to measure how long a certain function takes to run and came across the following code (source: http://snippets.dzone.com/posts/show/4254 ), which claims to "...record & output the execution time of a piece of code in microseconds":
/* Put this line at the top of the file: */
#include <sys/time.h>
/* Put this right before the code you want to time: */
struct timeval timer_start, timer_end;
gettimeofday(&timer_start, NULL);
/* Put this right after the code you want to time: */
gettimeofday(&timer_end, NULL);
double timer_spent = timer_end.tv_sec - timer_start.tv_sec + (timer_end.tv_usec - timer_start.tv_usec) / 1000000.0;
printf("Time spent: %.6f\n", timer_spent);
But in my experience, this code outputs the time in seconds, not microseconds. I'd like some input on whether I am right or wrong (I need to clarify this once and for all).
Upvotes: 2
Views: 4156
Reputation: 35089
It computes the time difference in seconds, with sub-second precision contributed by the microseconds term (timer_end.tv_usec - timer_start.tv_usec) / 1000000.0. So you should be OK :)
Upvotes: 1
Reputation: 7778
To get the result in microseconds instead of seconds, modify it like this:
double timer_spent = (timer_end.tv_sec - timer_start.tv_sec)*1e6 + (timer_end.tv_usec - timer_start.tv_usec);
Upvotes: -1
Reputation: 41862
You are right. The tv_sec member of the structure stores seconds, and the tv_usec member (microseconds) is converted to seconds by dividing by 10^6, so the final value is in seconds.
Upvotes: 2