Reputation: 480
When I run the following code
#include <stdio.h>
#include <stdlib.h>
int main(){
    printf("2/10=%lf 2./10=%lf\n", 2/10, 2./10);
    printf("2/10=%lf\n", 2/10);
    printf("2/10=%d\n", 2/10);
    printf("2./10=%lf\n", 2./10);
    return 0;
}
I expected it to print
2/10=0.000000 2./10=0.200000
2/10=0.000000
2/10=0
2./10=0.200000
and instead I get
2/10=0.200000 2./10=0.000000
2/10=0.200000
2/10=0
2./10=0.200000
After thinking about it a bit I can understand that in the first line the 2/10 could get interpreted as a floating-point division instead of an integer one because I have put the %lf specifier in the printf() (is that really what happens?), but what I cannot explain is why the following 2./10 prints 0.000000 instead of 0.200000, as it does in the fourth line.
Can anyone explain this to me?
EDIT:
If I change the code slightly to
#include <stdio.h>
#include <stdlib.h>
int main(){
    printf("2/10=%lf\n", 2/10);
    printf("2/10=%lf 2./10=%lf\n", 2/10, 2./10);
    printf("2/10=%d\n", 2/10);
    printf("2./10=%lf\n", 2./10);
    return 0;
}
the output becomes
2/10=0.000000
2/10=0.200000 2./10=0.000000
2/10=0
2./10=0.200000
which seems to support the idea that the 0.2 in the second printf() statement actually corresponds to the calculation of 2./10, and the 0.0 in the same printf() statement to some undefined behaviour...
Upvotes: 2
Views: 168
Reputation: 106102
2/10 (integer division) produces 0, which has type int. Printing data with the wrong conversion specification invokes undefined behavior. The C standard (C11 7.21.6.1p9) says:
If a conversion specification is invalid, the behavior is undefined.282) If any argument is not the correct type for the corresponding conversion specification, the behavior is undefined.
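As a minimal sketch (not part of the original code), here is one way the calls could be written so that every argument's type matches its conversion specification:
#include <stdio.h>

int main(void){
    /* integer division happens first; the int result is then converted to double for %lf */
    printf("2/10 as double = %lf\n", (double)(2/10));  /* prints 0.000000 */
    /* integer division printed with the matching %d specifier */
    printf("2/10 = %d\n", 2/10);                       /* prints 0 */
    /* floating-point division: the 2. literal makes one operand a double */
    printf("2./10 = %lf\n", 2./10);                    /* prints 0.200000 */
    return 0;
}
With matching types the output is well defined; the first line still prints 0.000000 because the division itself is done in int before the result is converted to double.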
Upvotes: 4