CL40

Reputation: 598

Confused about AND bit masking

I am playing around with bit masking and I had thought I understood bitwise math...apparently not.

#include <stdio.h>

int main()
{
    printf("%08x\n", 0x01111111 & 0xF0F0F0F);

    /* 
     * 0000 0001 0001 0001 0001 0001 0001 0001 
     * 1111 0000 1111 0000 1111 0000 1111 
     * -----------------------------
     * 0000 0000 0001 0000 0001 0000 0001 
     */ 
    
    return 0;
}

Above is a code snippet of a simple bit mask using 0xF0F0F0F to turn off every other nibble.

I know that 0x01111111 converted to binary is 0000 0001 0001 0001 0001 0001 0001 0001. If we AND this against the mask 1111 0000 1111 0000 1111 0000 1111, I would have expected the output to be 0000 0000 0001 0000 0001 0000 0001. However, running this program gives a result I didn't expect - 01010101. It would appear the leading 0 in the MSB position is disregarded?

I'm sorry if this is trivial, I'm sure it is, but I am confused as to how this result comes about.

Upvotes: 0

Views: 275

Answers (2)

user2233706

Reputation: 7207

This is what's happening:

    /* 
     * 0000 0001 0001 0001 0001 0001 0001 0001 
     * 0000 1111 0000 1111 0000 1111 0000 1111 
     * --------------------------------------- 
     * 0000 0001 0000 0001 0000 0001 0000 0001 
     */ 

0xF0F0F0F has implicit 0's at the beginning. Hexadecimal literals are written with the most significant digit first, so any digits you don't write are leading zeros. For 0x1, the 1 is the least significant digit.
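
You can verify this with a quick check (a minimal sketch; the equality comparison and the %d format are only there for illustration):

#include <stdio.h>

int main(void)
{
    /* Writing the leading zero or not makes no difference:
     * both literals denote the same value. */
    printf("%d\n", 0xF0F0F0F == 0x0F0F0F0F);   /* prints 1 */

    /* %08x pads with leading zeros so all 8 hex digits show up. */
    printf("%08x\n", 0x01111111 & 0xF0F0F0F);  /* prints 01010101 */

    return 0;
}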

Upvotes: 0

4386427

Reputation: 44274

0xF0F0F0F is really 0x0F0F0F0F. When you don't type "enough" digits to fill the whole integer, zeros are inserted automatically at the most significant end (e.g. if you just type 0x1, the internal representation is 0x00000001 for a 32-bit integer).

So for your code it's

/* 
 * 0000 0001 0001 0001 0001 0001 0001 0001 (binary)
 * 0000 1111 0000 1111 0000 1111 0000 1111 (binary) 
 * ---------------------------------------
 * 0000 0001 0000 0001 0000 0001 0000 0001 (binary)
 */ 

and when printed as hex with %08x, you get 01010101.
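
For a quick demonstration of the automatic zero padding, you can print a few literals yourself (a minimal sketch along the same lines):

#include <stdio.h>

int main(void)
{
    /* A short literal just means its high digits are zero. */
    printf("%08x\n", 0x1);        /* prints 00000001 */
    printf("%08x\n", 0xF0F0F0F);  /* prints 0f0f0f0f */

    return 0;
}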

Upvotes: 1
