randombits

Reputation: 48490

Hex 0x0001 vs 0x00000001

Often in code that does permission checking, I see some folks use hex 0x0001 and others use 0x00000001. Both look equivalent to a decimal 1, if I'm not mistaken.

Why use one over the other? Is it just a matter of preference?

Upvotes: 7

Views: 24645

Answers (2)

Tamas Czinege

Reputation: 121414

Assuming that this is C, C++, Java, C# or something similar, they are the same. 0x0001 visually suggests a 16-bit value while 0x00000001 suggests a 32-bit value, but the actual type and width of the literal are determined by the language's rules when the compiler evaluates it, not by the number of digits written. This is a question of coding style, and it makes no difference in the compiled code.
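As a quick sanity check (a minimal C sketch; the same holds in the other languages mentioned), both literals have the same value and the same size, so the extra zeros are purely visual:

    #include <stdio.h>

    int main(void)
    {
        /* Both literals are plain int with the value 1; the leading
           zeros in the source are only visual padding. */
        int a = 0x0001;
        int b = 0x00000001;

        printf("%d\n", a == b);                                  /* prints 1 */
        printf("%zu %zu\n", sizeof 0x0001, sizeof 0x00000001);   /* same size */
        return 0;
    }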

Upvotes: 13

Joshua

Reputation: 43327

What's going on here is that these constants are bitmask flags, and it's traditional to pad them with leading zeros out to the full width of the mask. I would furthermore guess that the width of the mask was widened at some point to make room for more specialized permissions.
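For illustration (the flag names here are invented, not from any particular codebase), a 32-bit permission mask is usually written with every flag padded to the full width so the bit positions line up when you read them:

    #include <stdio.h>

    /* Hypothetical permission flags, each padded to the 32-bit width of the mask. */
    #define PERM_READ     0x00000001
    #define PERM_WRITE    0x00000002
    #define PERM_EXECUTE  0x00000004
    #define PERM_ADMIN    0x00010000  /* a later, more specialized permission */

    int main(void)
    {
        unsigned int perms = PERM_READ | PERM_WRITE;

        if (perms & PERM_WRITE)
            printf("write access granted\n");

        return 0;
    }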

Upvotes: 3
