HyperionX

Reputation: 1646

C Timer Difference returning 0ms

I'm learning C (and Cygwin) and trying to complete a simple remote execution system for an assignment.

One simple requirement that I'm getting hung up on is: 'Client will report the time taken for the server to respond to each query.'

I've searched around and implemented other working solutions, but I always get back 0 as a result.

A snippet of what I have:

#include <stdio.h>
#include <string.h>    /* strcspn(), strcmp(), strlen() */
#include <strings.h>   /* bzero() */
#include <time.h>      /* clock(), CLOCKS_PER_SEC */
#include <unistd.h>    /* read(), write(), sleep() */

for(;;)
{
    //- Reset loop variables
    bzero(sendline, 1024);
    bzero(recvline, 1024);
    printf("> ");
    fgets(sendline, 1024, stdin);

    //- Handle program 'quit'
    sendline[strcspn(sendline, "\n")] = 0;
    if (strcmp(sendline,"quit") == 0) break;

    //- Process & time command
    clock_t start = clock(), diff;
    write(sock, sendline, strlen(sendline)+1);
    read(sock, recvline, 1024);
    sleep(2);
    diff = clock() - start;
    int msec = diff * 1000 / CLOCKS_PER_SEC;

    printf("%s (%d s / %d ms)\n\n", recvline, msec/1000, msec%1000);
}

I've also tried using a float, and multiplying by 10,000 instead of dividing by 1,000 just to see if any value shows up at all, but I always get back 0. Clearly something must be wrong with how I'm implementing this, but after much reading I can't figure it out.

--Edit--

Printout of values:

clock_t start = clock(), diff;
printf("Start time: %lld\n", (long long) start);
//process stuff
sleep(2);
printf("End time: %lld\n", (long long) clock());
diff = clock() - start;

printf("Diff time: %lld\n", (long long) diff);
printf("Clocks per sec: %d", CLOCKS_PER_SEC);

Result: Start time: 15 End time: 15 Diff time: 0 Clocks per sec: 1000

-- FINAL WORKING CODE --

#include <sys/time.h>

//- Setup clock
struct timeval start, end;

//- Start timer
gettimeofday(&start, NULL);

//- Process command
/* Process stuff */

//- End timer
gettimeofday(&end, NULL);

//- Calculate difference in microseconds
//  (subtract the seconds first so the intermediate value can't overflow a 32-bit long)
long int usec =
    (end.tv_sec - start.tv_sec) * 1000000L +
    (end.tv_usec - start.tv_usec);

//- Convert to milliseconds
double msec = (double)usec / 1000;

//- Print result (3 decimal places)
printf("\n%s (%.3fms)\n\n", recvline, msec);

Upvotes: 4

Views: 352

Answers (2)

Medinoc

Reputation: 6608

Cygwin means you're on Windows.

On Windows, the "current time" seen by an executing thread is only updated every 1/64th of a second (roughly 16 ms), so if clock() is based on it, then even though it returns a value in milliseconds, it will never be more precise than about 15.6 ms.

GetThreadTimes() has the same limitation.
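If you genuinely need sub-millisecond wall-clock resolution on Windows, QueryPerformanceCounter is the usual way around that granularity. A rough sketch of the idea (plain Win32, untested under Cygwin, where the POSIX gettimeofday()/clock_gettime() route is probably simpler):

#include <windows.h>
#include <stdio.h>

int main(void)
{
    LARGE_INTEGER freq, t0, t1;

    QueryPerformanceFrequency(&freq);   /* counter ticks per second */
    QueryPerformanceCounter(&t0);

    Sleep(2000);                        /* stand-in for the real request/response */

    QueryPerformanceCounter(&t1);

    double msec = (double)(t1.QuadPart - t0.QuadPart) * 1000.0
                / (double)freq.QuadPart;
    printf("%.3f ms\n", msec);
    return 0;
}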

Upvotes: 2

Roddy

Reputation: 68033

I think you misunderstand clock() and sleep().

clock() measures the CPU time used by your program, but sleep() sleeps without using any CPU time. Maybe you want to use time() or gettimeofday() instead?
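To see the difference for yourself, here is a small standalone test (my example, not your code) that times the same sleep(2) both ways; clock() reports close to 0 ms while gettimeofday() reports roughly 2000 ms:

#include <stdio.h>
#include <sys/time.h>
#include <time.h>
#include <unistd.h>

int main(void)
{
    struct timeval w0, w1;
    clock_t c0 = clock();
    gettimeofday(&w0, NULL);

    sleep(2);   /* sleeping burns almost no CPU time */

    clock_t c1 = clock();
    gettimeofday(&w1, NULL);

    printf("CPU time:  %ld ms\n",
           (long)((c1 - c0) * 1000 / CLOCKS_PER_SEC));
    printf("Wall time: %ld ms\n",
           (long)((w1.tv_sec - w0.tv_sec) * 1000 +
                  (w1.tv_usec - w0.tv_usec) / 1000));
    return 0;
}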

Upvotes: 4
