Reputation: 47
I want to verify the hexadecimal representation of the number 1.0 in C. Below is my code:
#include <stdio.h>

/* Print the bytes of an object, lowest memory address first */
void showBytes(unsigned char *p, int size)
{
    int i;
    for (i = 0; i < size; i++) {
        printf("%.2x", p[i]);
    }
}

int main(int argc, char **argv)
{
    float f = 1.0f;
    showBytes((unsigned char *)&f, sizeof(f));
    return 0;
}
When I build and run the program, I get 39300000, which is not the correct representation of 1.0 (00003039). Can anyone explain to me why I get this incorrect value? Thanks!
Upvotes: 0
Views: 305
Reputation: 612964
This is an endianness mismatch: you are displaying the value using one byte-order convention, but your expected value is written in the other. On a little-endian machine the least significant byte is stored at the lowest address, so printing the bytes in address order produces them in the reverse of the most-significant-first notation you expected.
Endianness is a well-known issue for integer data types, but it is perhaps less well known that it also affects the floating-point representation.
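As an illustration (a minimal sketch, not part of the original answer), the following program detects the machine's byte order at runtime and prints the bytes of the float both in memory order and from the highest address down, so the two conventions can be compared directly:

#include <stdio.h>

int main(void)
{
    float f = 1.0f;
    unsigned char *p = (unsigned char *)&f;
    unsigned int one = 1;
    int i;

    /* If the first byte of the unsigned int 1 is 1, the least significant
       byte lives at the lowest address, i.e. the machine is little-endian. */
    if (*(unsigned char *)&one == 1)
        printf("machine is little-endian\n");
    else
        printf("machine is big-endian\n");

    /* Bytes in memory order (lowest address first) */
    printf("memory order : ");
    for (i = 0; i < (int)sizeof f; i++)
        printf("%.2x", p[i]);
    printf("\n");

    /* Bytes from the highest address down; on a little-endian machine this
       is most-significant-byte first, matching ordinary written hex. */
    printf("reversed     : ");
    for (i = (int)sizeof f - 1; i >= 0; i--)
        printf("%.2x", p[i]);
    printf("\n");

    return 0;
}

On a little-endian machine the two printed byte sequences are reversed copies of each other, which is exactly the effect described above.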
Upvotes: 10