Reputation: 24140
I have the following C99 program which measures the performance of simple division operations relative to addition. However, the difftime function keeps returning 0 even though the program is clearly taking several seconds to process runAddition and runDivision with iterations set to 1 billion.
#include <stdio.h>
#include <time.h>

void runAddition(long long iterations)
{
    long long temp;
    for (long long i = 1; i <= iterations; i++)
    {
        temp = temp + i;
    }
}

void runDivision(long long iterations)
{
    long long temp;
    // Start at 1 to avoid division by 0!
    for (long long i = 1; i <= iterations; i++)
    {
        temp = temp / i;
    }
}

int main()
{
    long long iterations = 1000000000;
    time_t startTime;

    printf("How many iterations would you like to run of each operation? ");
    scanf("%d", &iterations);

    printf("Running %d additions...\n", iterations);
    startTime = time(NULL);
    runAddition(iterations);
    printf("%d additions took %f seconds\n", iterations, difftime(time(NULL), startTime));

    printf("Running %d divisions...\n", iterations);
    startTime = time(NULL);
    runDivision(iterations);
    printf("%d divisions took %f seconds\n", iterations, difftime(time(NULL), startTime));
}
Upvotes: 5
Views: 7589
Reputation: 37427
Your format string expects an int (%d) and a double (%f). Your arguments are a long long and a double. You should change the first conversion specifier to %lld.
When pushing arguments on the stack to call printf, you push the long long using 8 bytes and the double using 8 bytes too. When printf reads the format string, it expects first an int in 4 bytes and then a double in 8 bytes. printf gets the int correctly because you are on a little-endian machine and the first four bytes of your long long are enough to represent the value. printf then reads the double from the last four bytes of the long long followed by the first four bytes of the actual double. As the last four bytes of the long long are zeroes, what printf thinks is a double starts with four zero bytes, resulting in a very tiny value according to the binary representation of doubles.
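For reference, a minimal corrected sketch reusing the question's names (the scanf call has the same %d mismatch, so it is adjusted here as well; temp is given an initial value since the original leaves it indeterminate):

#include <stdio.h>
#include <time.h>

void runAddition(long long iterations)
{
    long long temp = 0;
    for (long long i = 1; i <= iterations; i++)
        temp = temp + i;
}

int main(void)
{
    long long iterations = 1000000000;
    time_t startTime;

    printf("How many iterations would you like to run of each operation? ");
    scanf("%lld", &iterations);                 /* %lld matches a long long */

    startTime = time(NULL);
    runAddition(iterations);
    printf("%lld additions took %f seconds\n",  /* %lld for long long, %f for double */
           iterations, difftime(time(NULL), startTime));
    return 0;
}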
Upvotes: 7
Reputation: 454960
Try using %lld in place of %d in the printf:

printf("%lld additions t
         ^^^
Upvotes: 2
Reputation: 117220
Make temp volatile so it does not get optimized away. The compiler is probably seeing it as a section/function with no side effects.
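A minimal sketch of that change applied to the question's runAddition (temp is also initialized here, since the original leaves it indeterminate):

void runAddition(long long iterations)
{
    /* volatile forces the compiler to perform every store to temp,
       so the loop cannot be removed as dead code */
    volatile long long temp = 0;
    for (long long i = 1; i <= iterations; i++)
    {
        temp = temp + i;
    }
}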
Upvotes: 1
Reputation:
time() returns a time_t, which has a resolution of one second. The time required for runDivision() is below that: one billion operations on a multi-GHz core will take less than a second.
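If sub-second measurements are wanted, one common alternative (not part of this answer, just a sketch) is clock() from <time.h>, which counts CPU time in CLOCKS_PER_SEC ticks:

#include <stdio.h>
#include <time.h>

int main(void)
{
    clock_t start = clock();

    /* ... work being measured goes here ... */
    volatile long long temp = 0;
    for (long long i = 1; i <= 1000000000LL; i++)
        temp = temp + i;

    clock_t end = clock();
    /* dividing the tick difference by CLOCKS_PER_SEC gives seconds
       with sub-second resolution, unlike time()/difftime() */
    printf("took %f seconds of CPU time\n",
           (double)(end - start) / CLOCKS_PER_SEC);
    return 0;
}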
Upvotes: 0
Reputation: 3431
difftime calculates the difference in seconds between time1 and time2, so maybe your time difference is less than 1 second? Output your start and end times to verify.
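For example (a fragment reusing the question's startTime, with an added endTime; time_t is printed through a cast since its underlying type varies):

time_t startTime = time(NULL);
runDivision(iterations);
time_t endTime = time(NULL);

/* print both timestamps to see whether they differ at all */
printf("start: %lld  end: %lld  diff: %f seconds\n",
       (long long)startTime, (long long)endTime, difftime(endTime, startTime));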
Upvotes: 0