Reputation: 55
Hi, I'm trying to convert DELAY from milliseconds into microseconds and store it in a double, but the code below prints 0.000000. On a calculator I get the result I'm after: with a DELAY of 500 it should be 0.500000, so I can use it with 'struct timeval'.
#define DELAY 500
double num = (DELAY / 1000);
printf("Num: %lf",num);
Upvotes: 1
Views: 2750
Reputation: 158629
This will use integer division:
double num = (DELAY / 1000);
If you change the 1000
to a floating constant then you will obtain the result you want:
double num = (DELAY / 1000.0);
This works because division performs the usual arithmetic conversions on its operands, which in this case causes DELAY to be converted to a double as well.
Note: it is probably worth noting that if you want to convert from milliseconds into microseconds, you should multiply by 1000, not divide.
Upvotes: 2
Reputation: 60037
It does the division as integers and then converts the result.
Try:
#define DELAY 500.0f
double num = (DELAY / 1000.0f);
printf("Num: %lf",num);
Upvotes: 0
Reputation: 25946
This:
(DELAY / 1000);
is integer arithmetic, and evaluates to zero whenever DELAY
is less than 1000. Change it to:
(DELAY / 1000.0);
Upvotes: 3