Reputation: 4856
This might be a very stupid question, so please be gentle. If I run the following code:
#include <climits>
#include <cstdint>
#include <stdio.h>
typedef uint64_t obs;
int main () {
    printf("Size : %i\n", sizeof(obs)*CHAR_BIT);
    obs value = 1 << 3;
    printf("Number: %zu\n", value);
    printf("Bits : ");
    for (size_t ind = 0; ind < sizeof(obs)*CHAR_BIT; ind++) {
        if (value & (1 << ind)) {
            printf("%zu ", ind);
        }
    }
}
For various typedefs I get the following result for 64-bit data types (I am running on a 64-bit system):
uint64_t / size_t / long unsigned
Size : 64
Number: 8
Bits : 3 35
and the following for other lengths:
uint32_t / uint16_t / uint8_t
Size : 32 / 16 / 8
Number : 8
Bits : 3
If I change the shift, the 64-bit type seems to have a "mirror counterpart" which is shifted by 32 bits. The same holds true when I change the value of value. Is there a reason for this, or am I missing something?
using gcc on Win7
Upvotes: 2
Views: 110
Reputation: 172378
size_t is architecture-dependent, so on a 32-bit system size_t will likely be at least 32 bits wide, and on a 64-bit system it will likely be at least 64 bits wide.
According to the C99 standard (on the result of the sizeof operator):
The value of the result is implementation-defined, and its type (an unsigned integer type) is size_t, defined in
<stddef.h>
(and other headers)
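So the result of sizeof(obs)*CHAR_BIT has type size_t, whose width depends on the target. A quick standalone check (the values shown in the comment are the typical ones, not guaranteed):
#include <stddef.h>
#include <stdio.h>
int main() {
    // sizeof yields size_t, so print it with %zu;
    // typically 8 on a 64-bit target and 4 on a 32-bit target
    printf("sizeof(size_t) = %zu bytes\n", sizeof(size_t));
}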
Also change
if (value & (1 << ind))
to
if (value & (1LL << ind))
because the integer constant 1 has type int, which is 32 bits here, so 1 << ind is evaluated in 32-bit arithmetic. Shifting a 32-bit value by 32 or more positions is undefined behavior; on x86 the shift count is effectively taken modulo 32, which is why bit 3 reappears as a "mirror" at bit 35.
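For reference, a corrected sketch of the loop from the question (same typedef and headers as the original; I use 1ULL here rather than 1LL so the whole shift stays in unsigned 64-bit arithmetic, and %zu for the sizeof result):
#include <climits>
#include <cstdint>
#include <stdio.h>
typedef uint64_t obs;
int main () {
    printf("Size : %zu\n", sizeof(obs) * CHAR_BIT);   // sizeof(...) * CHAR_BIT is a size_t
    obs value = obs(1) << 3;                          // widen to 64 bits before shifting
    printf("Number: %llu\n", (unsigned long long)value);
    printf("Bits : ");
    for (size_t ind = 0; ind < sizeof(obs) * CHAR_BIT; ind++) {
        if (value & (1ULL << ind)) {                  // 64-bit shift, no wrap-around at 32
            printf("%zu ", ind);
        }
    }
    printf("\n");
}
With the 64-bit shift the loop reports only bit 3, with no mirrored bit at 35.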
Upvotes: 2