Reputation: 1167
The code that I can't understand does this:
int decodeTimeStampByte(final byte timeByte) {
return timeByte & (~64);
}
So for instance, if I get the byte 4c (which is ASCII L), what exactly would the above function do to it? How about the byte 44?
Upvotes: 1
Views: 344
Reputation: 1636
This function returns the lower 6 bits for non-negative values and clears bit 6 for negative values. Since 2^6 = 64 = 1000000 in binary, ~64 has every bit set except bit 6, so timeByte & (~64) simply clears that bit: the result lies in [0..63] for non-negative values of timeByte and in [-128..-65] for negative ones.
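A quick way to see both output ranges is to feed a few sample bytes through the expression (a minimal sketch; the class name and the sample values are my own, only decodeTimeStampByte comes from the question):

public class ClearBit6Demo {
    // same expression as in the question: clear bit 6 (value 64), keep everything else
    static int decodeTimeStampByte(final byte timeByte) {
        return timeByte & (~64);
    }

    public static void main(String[] args) {
        // non-negative bytes end up in [0..63], negative bytes in [-128..-65]
        byte[] samples = { 0x00, 0x3F, 0x40, 0x7F, (byte) 0x80, (byte) 0xFF };
        for (byte b : samples) {
            System.out.printf("%4d (0x%02X) -> %4d%n", b, b & 0xFF, decodeTimeStampByte(b));
        }
    }
}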
Upvotes: 1
Reputation: 14467
The '~' is bitwise 'not', so 64 = 0x40 = 01000000b and ~64 = 10111111b in eight bits (every bit set except bit 6).
Then '&' is bitwise 'and', so the expression simply clears bit 6 of timeByte. For a non-negative byte that is a truncation to the 0..63 range.
decodeTimeStampByte(0x4c) = 0xC (12)
decodeTimeStampByte(0x44) = 0x4 (4)
P.S. Note that the higher bits are left untouched, so a negative byte stays negative (it is sign-extended to int before the '&'). It is either a bug in the code or some intention to leave the sign bit (bit 7) in place.
P.P.S. Seems like an ancient bit-hack to squeeze out some more performance
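For what it's worth, the two values can be checked directly in Java (a small sketch I added; the wrapping class is just scaffolding):

public class TimeStampCheck {
    static int decodeTimeStampByte(final byte timeByte) {
        return timeByte & (~64);
    }

    public static void main(String[] args) {
        // 0x4C ('L'): clearing bit 6 (0x40) leaves 0x0C
        System.out.println(decodeTimeStampByte((byte) 0x4C));  // prints 12
        // 0x44 ('D'): clearing bit 6 leaves 0x04
        System.out.println(decodeTimeStampByte((byte) 0x44));  // prints 4
    }
}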
Upvotes: 3
Reputation: 13139
This code will clear bit 6. But if bit 7 is set, it will also set all bits from 8 to 31, because the byte is sign-extended when it is promoted to int.
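For example (my own sample value, not from the question), a negative byte such as (byte) 0xC4 is sign-extended to 0xFFFFFFC4 before the '&', so the result keeps bits 8..31 set:

public class SignExtensionDemo {
    public static void main(String[] args) {
        byte timeByte = (byte) 0xC4;      // -60 as a signed byte
        int decoded = timeByte & (~64);   // the byte is sign-extended to int before the '&'
        System.out.println(decoded);                         // -124
        System.out.println(Integer.toHexString(decoded));    // ffffff84: bits 8..31 are still set
    }
}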
Upvotes: 2