Reputation: 4046
I'm trying to get the bytes from an int into a series of chars in a portable way across all little endian systems.
I have the following code:
#include <stdio.h>

int
main()
{
    int i = 0xabcdef12;
    printf("i: %x\n", i);

    char a, b, c, d;
    a = (i >> 000) & 0xFF;    /* octal shift counts: 0, 8, 16, 24 */
    b = (i >> 010) & 0xFF;
    c = (i >> 020) & 0xFF;
    d = (i >> 030) & 0xFF;
    printf("a b c d: %x %x %x %x\n", a, b, c, d);

    if (a == 0x12)
        printf("a is 0x12\n");
    if (b == 0xef)
        printf("b is 0xef\n");
    if (c == 0xcd)
        printf("c is 0xcd\n");
    if (d == 0xab)
        printf("d is 0xab\n");

    if (a == 0xffffff12)
        printf("a is 0xffffff12\n");
    if (b == 0xffffffef)
        printf("b is 0xffffffef\n");
    if (c == 0xffffffcd)
        printf("c is 0xffffffcd\n");
    if (d == 0xffffffab)
        printf("d is 0xffffffab\n");

    return 0;
}
This piece of code compiles without any warnings when using -Wall.
When run, it gives:
i: abcdef12
a b c d: 12 ffffffef ffffffcd ffffffab
a is 0x12
b is 0xffffffef
c is 0xffffffcd
d is 0xffffffab
Here is some gdb output:
Breakpoint 1, main () at test.c:14
14 if(a == 0x12)
(gdb) p/x a
$1 = 0x12
(gdb) p/x b
$2 = 0xef
(gdb) p/x c
$3 = 0xcd
(gdb) p/x d
$4 = 0xab
I'm pretty sure I'm doing something wrong. It would really help me understand what is going on if you could answer this: why aren't the & 0xff bitmasks working as I expect? And if anybody has a reliable (system-independent; endianness is not important) way of going from int to char[], that would be great.
Upvotes: 4
Views: 3024
Reputation: 23058
During the comparison, a, b, c and d are promoted to int. Since they are plain char, which is signed here, the most significant bit is treated as the sign bit, and the promotion to int sign-extends it, copying the MSB into all the higher bits. 0x12's MSB is 0, while 0xef, 0xcd and 0xab all have an MSB of 1. That's why, after promotion, you get 0x00000012 but 0xffffffef, 0xffffffcd and 0xffffffab.
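A small sketch of that promotion in isolation (the second printf, masking after the promotion, is just my illustration of one way to undo the sign extension):

#include <stdio.h>

int main(void)
{
    int i = 0xabcdef12;
    char b = (i >> 010) & 0xFF;   /* b holds 0xef, i.e. -17 where plain char is signed */

    printf("%x\n", b);            /* b is promoted to int and sign-extended: prints ffffffef */
    printf("%x\n", b & 0xFF);     /* masking after the promotion recovers ef */

    return 0;
}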
If you change
char a, b, c, d;
into
unsigned char a, b, c, d;
then you get what you expected.
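For illustration, a minimal sketch of the program with only that declaration changed (the output noted in the comments is what I'd expect on a typical little-endian system):

#include <stdio.h>

int main(void)
{
    int i = 0xabcdef12;
    unsigned char a, b, c, d;     /* unsigned: promotion to int cannot sign-extend */

    a = (i >> 000) & 0xFF;
    b = (i >> 010) & 0xFF;
    c = (i >> 020) & 0xFF;
    d = (i >> 030) & 0xFF;

    printf("a b c d: %x %x %x %x\n", a, b, c, d);   /* expected: 12 ef cd ab */

    if (a == 0x12 && b == 0xef && c == 0xcd && d == 0xab)
        printf("all four comparisons now succeed\n");

    return 0;
}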
Upvotes: 1
Reputation: 471559
Here's a quick fix.
Change:
char a, b, c, d;
to
unsigned char a, b, c, d;
The reason is that char is signed on your system. When you pass a, b, c and d into printf(), they get promoted to int and are sign-extended. That's why you get all those leading ffs.
GDB reports the correct values because it reads the chars directly (and thus no integer promotion happens).
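To address the last part of the question (a system-independent way of going from an int to a byte array), here is a rough sketch; the helper name int_to_bytes is just illustrative, not part of any standard API:

#include <stdio.h>

/* Illustrative helper: write the value out least-significant byte first.
   Using shifts rather than pointer casts keeps the result independent of
   the host's byte order, and unsigned char avoids any sign extension. */
static void int_to_bytes(unsigned int value, unsigned char out[4])
{
    out[0] = (value >> 0)  & 0xFFu;
    out[1] = (value >> 8)  & 0xFFu;
    out[2] = (value >> 16) & 0xFFu;
    out[3] = (value >> 24) & 0xFFu;
}

int main(void)
{
    unsigned char bytes[4];
    int_to_bytes(0xabcdef12u, bytes);
    printf("%x %x %x %x\n", bytes[0], bytes[1], bytes[2], bytes[3]);   /* 12 ef cd ab */
    return 0;
}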
Upvotes: 5