Reputation:
I have the following code:
#include <stdio.h>
#include <stdlib.h>

int main() {
    int first = 0xcdab3412;
    int second = 0xabcd1234;
    int result1 = (second >> 16) | (first & 0xFFFF0000);
    int result2 = (first << 16) | (second & 0x0000FFFF);
    printf("Outputs: %x and %x.\n", result1, result2);
    return 0;
}
result2 turns out as expected and outputs 34121234.
However, result1 outputs ffffabcd. If I just leave it as (first & 0xFFFF0000), it correctly outputs cdab0000.
Why is result1 ffffabcd and not cdababcd?
Upvotes: 2
Views: 1517
Reputation: 374
second is a signed integer, and its value is negative because the top bit is set. When you shift a negative value to the right, the vacated high bits are filled with copies of the sign bit rather than zeros, so second >> 16 comes out as ffffabcd instead of 0000abcd.
If you use unsigned ints, you'll get the result you expect.
Signed vs. unsigned has always been a source of confusion, so you need to tread carefully.
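For illustration, here is a minimal sketch of the same program with the types changed to unsigned int, as suggested above (the u suffixes on the constants just make the unsigned intent explicit):

#include <stdio.h>

int main() {
    unsigned int first = 0xcdab3412u;
    unsigned int second = 0xabcd1234u;
    /* With unsigned operands, >> shifts in zero bits, so no sign extension occurs. */
    unsigned int result1 = (second >> 16) | (first & 0xFFFF0000u);
    unsigned int result2 = (first << 16) | (second & 0x0000FFFFu);
    printf("Outputs: %x and %x.\n", result1, result2);
    return 0;
}

This should print cdababcd and 34121234.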
Upvotes: 4
Reputation: 891
It's called sign extension. Set the types to unsigned int and it should work.
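If you would rather keep the variables as plain int, masking off the sign-extended bits after the shift (a common alternative, not part of this answer's suggestion) also gives the intended value:

/* (second >> 16) is ffffabcd after sign extension; & 0x0000FFFF strips it back to 0000abcd. */
int result1 = ((second >> 16) & 0x0000FFFF) | (first & 0xFFFF0000);

With that change result1 prints cdababcd.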
Upvotes: 5