user4973007

clock_gettime() function outputting incorrect time

I am trying to get the runtime of the following code using the clock_gettime function. However, when I run the code it reports a time of 0.0000 every time. I have also printed the start and stop times individually, and they come out exactly the same.

struct timespec start, stop;
double accum;

if( clock_gettime( CLOCK_REALTIME, &start) == -1 ) {
  perror( "clock gettime" );
  exit( EXIT_FAILURE );
}

int src = 1, final_ret = 0;
for (int t = 0; t < rows - 1; t += pyramid_height)
{
    int temp = src;
    src = final_ret;
    final_ret = temp;

    // Calculate this for the kernel argument...
    int arg0 = MIN(pyramid_height, rows-t-1);
    int theHalo = HALO;

    // Set the kernel arguments.
    clSetKernelArg(cl.kernel(kn), 0,  sizeof(cl_int), (void*) &arg0);
    clSetKernelArg(cl.kernel(kn), 1,  sizeof(cl_mem), (void*) &d_gpuWall);
    clSetKernelArg(cl.kernel(kn), 2,  sizeof(cl_mem), (void*) &d_gpuResult[src]);
    clSetKernelArg(cl.kernel(kn), 3,  sizeof(cl_mem), (void*) &d_gpuResult[final_ret]);
    clSetKernelArg(cl.kernel(kn), 4,  sizeof(cl_int), (void*) &cols);
    clSetKernelArg(cl.kernel(kn), 5,  sizeof(cl_int), (void*) &rows);
    clSetKernelArg(cl.kernel(kn), 6,  sizeof(cl_int), (void*) &t);
    clSetKernelArg(cl.kernel(kn), 7,  sizeof(cl_int), (void*) &borderCols);
    clSetKernelArg(cl.kernel(kn), 8,  sizeof(cl_int), (void*) &theHalo);
    clSetKernelArg(cl.kernel(kn), 9,  sizeof(cl_int) * (cl.localSize()), 0);
    clSetKernelArg(cl.kernel(kn), 10, sizeof(cl_int) * (cl.localSize()), 0);
    clSetKernelArg(cl.kernel(kn), 11, sizeof(cl_mem), (void*) &d_outputBuffer);
    cl.launch(kn);
}

if( clock_gettime( CLOCK_REALTIME, &stop) == -1 ) {
  perror( "clock gettime" );
  exit( EXIT_FAILURE );
}
printf( "%lf\n", stop.tv_sec ); 
 printf( "%lf\n", start.tv_sec );  
accum = ( stop.tv_sec - start.tv_sec )
      + ( stop.tv_nsec - start.tv_nsec )
        / BILLION;
printf( "%lf\n", accum );

Any advice on what I'm doing wrong is much appreciated.

Upvotes: 1

Views: 471

Answers (1)

Jonathan Wakely

Reputation: 171283

timespec::tv_nsec is an integer type, so if BILLION is also an integer type then:

( stop.tv_nsec - start.tv_nsec )
    / BILLION;

will be computed with integer division and truncate to zero whenever the nanosecond difference is smaller than BILLION. If the tv_sec values are also the same, the total comes out as zero.
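
For example, assuming BILLION is the integer constant 1000000000 (its definition isn't shown in the question), a half-second difference vanishes entirely:

long diff = 500000000L;            // half a second, in nanoseconds
double accum = diff / 1000000000L; // integer division happens first
// accum is 0.0, not 0.5: the quotient truncates before the conversion to double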

Try:

double( stop.tv_nsec - start.tv_nsec )
    / BILLION;

That converts the numerator to double before dividing, so the division is performed in floating point and the fractional seconds are kept.
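
As a complete illustration, here is a minimal, self-contained sketch of both computations. The BILLION definition and the busy-loop workload are assumptions for the example, not taken from the question's code:

#include <cstdio>
#include <cstdlib>
#include <time.h>

#define BILLION 1000000000L   // assumed definition; match your own

int main()
{
    struct timespec start, stop;

    if( clock_gettime( CLOCK_REALTIME, &start ) == -1 ) {
        perror( "clock gettime" );
        exit( EXIT_FAILURE );
    }

    // Stand-in for the real work being timed.
    volatile long sink = 0;
    for (long i = 0; i < 50000000L; ++i) sink += i;

    if( clock_gettime( CLOCK_REALTIME, &stop ) == -1 ) {
        perror( "clock gettime" );
        exit( EXIT_FAILURE );
    }

    // Integer division: any sub-second difference truncates to 0.
    double truncated = ( stop.tv_sec - start.tv_sec )
                     + ( stop.tv_nsec - start.tv_nsec ) / BILLION;

    // Convert to double first, then divide: the sub-second part survives.
    double corrected = ( stop.tv_sec - start.tv_sec )
                     + double( stop.tv_nsec - start.tv_nsec ) / BILLION;

    printf( "truncated: %lf\n", truncated );
    printf( "corrected: %lf\n", corrected );
    return 0;
}

With a sub-second workload, the first printf shows 0.000000 while the second shows the actual elapsed fraction of a second.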

Upvotes: 4
