shaftoes

Reputation: 227

char conversion in bitwise operations

I'm implementing a function that will print out a bitmask. I would like to do this one byte at a time. I've come across a strange type-conversion problem I can't make sense of.

The following code snippet prints 256

char i = 128;
int j = 256;
printf("%u", (i & j));

Whereas changing i to int prints 0:

int i = 128;
int j = 256;
printf("%d", (i & j));

What accounts for the strange behaviour of the first?

Upvotes: 1

Views: 51

Answers (2)

Sergey Kalinichenko

Reputation: 727097

It appears that char is signed on your system, so storing 128 in a char actually yields -128 (the result of the overflow is implementation-defined). That negative value is then sign-extended on conversion to int when you evaluate (i & j).

Therefore, i becomes 0xFFFFFF80, and & is performed on 0x00000100 and 0xFFFFFF80:

00000000000000000000000100000000 // 256
11111111111111111111111110000000 // -128 (char holding 128), sign-extended

When i is an int, 128 fits without overflow, so no sign extension occurs: 128 and 256 have no set bits in common, and you get zero as expected.

To avoid this behavior, specify that i should be an unsigned char:

unsigned char i = 128;
int j = 256;
printf("%u", (i & j));

This prints zero.

Upvotes: 2

Brian McFarland

Reputation: 9442

char i = 128 stores a signed value (-128 where char is signed), which gets sign-extended to an int when you & it with j.

Upvotes: 1
