Reputation: 1075
I have this code snippet:
#include <stdio.h>

int main(void)
{
    struct value
    {
        int bit1 : 1;
        int bit2 : 4;
        int bit3 : 4;
    } bit;

    /* sizeof yields a size_t, so %zu is the correct format */
    printf("%zu\n", sizeof(bit));
    return 0;
}
I'm getting the output 4 (32-bit compiler). Can anyone explain how? Why is it not 1 + 4 + 4 = 9? I've never worked with bit fields before, so I'd love some help. Thank you. :)
Upvotes: 3
Views: 2011
Reputation: 22270
When you tell the C compiler this:
int bit1 : 1
it allocates an entire int, but lets you refer to its first bit as bit1.
So if we consider your code:
struct value
{
    int bit1 : 1;
    int bit2 : 4;
    int bit3 : 4;
} bit;
What you are telling the compiler is this: take as many ints as necessary, refer to bit 1 of that chunk as bit1, to bits 2-5 as bit2, and to bits 6-9 as bit3.
Since the total number of bits required is only 9, and an int is 32 bits on your machine's architecture, the memory space of a single int suffices. Thus you get the size as 4 (bytes).
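To see the unit-based allocation in action, here is a minimal sketch (the struct tags fits and overflows and the field wide are hypothetical names; the printed sizes assume a typical platform with 32-bit int, such as GCC on x86):
#include <stdio.h>

int main(void)
{
    /* 1 + 4 + 4 = 9 bits fit inside a single 32-bit int unit */
    struct fits {
        int bit1 : 1;
        int bit2 : 4;
        int bit3 : 4;
    };

    /* the extra 30-bit field cannot fit in the 23 bits left in
       the first unit, so a second int unit is allocated */
    struct overflows {
        int bit1 : 1;
        int bit2 : 4;
        int bit3 : 4;
        int wide : 30;
    };

    printf("%zu\n", sizeof(struct fits));      /* typically 4 */
    printf("%zu\n", sizeof(struct overflows)); /* typically 8 */
    return 0;
}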
If, instead, you were to define the struct using chars (char bit-fields are a common compiler extension; the standard only guarantees bit-fields of type _Bool, int, signed int, and unsigned int), then, since a char is 8 bits, the compiler would allocate the memory space of two chars for each struct value, and you would get 2 (bytes) as your output, as sketched below.
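A minimal sketch of that variant (assuming CHAR_BIT is 8 and a compiler, such as GCC, that accepts char bit-fields):
#include <stdio.h>

int main(void)
{
    struct value {
        /* bit1 + bit2 = 5 bits fit in the first 8-bit char;
           bit3 needs 4 bits but only 3 remain, so a second
           char unit is started */
        char bit1 : 1;
        char bit2 : 4;
        char bit3 : 4;
    } bit;

    printf("%zu\n", sizeof(bit)); /* typically 2 */
    return 0;
}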
Upvotes: 7
Reputation: 145829
Because C requires consecutive bit-fields to be packed into the same unit (here one signed int/unsigned int):
(C99, 6.7.2.1p10) "If enough space remains, a bit-field that immediately follows another bit-field in a structure shall be packed into adjacent bits of the same unit"
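The standard also provides an escape hatch: a zero-width unnamed bit-field ends the current unit, so the next bit-field starts in a new one (C99, 6.7.2.1p11). A minimal sketch illustrating both rules (the struct tags packed and split are hypothetical; sizes assume 32-bit int):
#include <stdio.h>

int main(void)
{
    /* adjacent bit-fields share a single int unit */
    struct packed {
        int a : 1;
        int b : 4;
    };

    /* the unnamed zero-width field closes the current unit,
       forcing b into a fresh int */
    struct split {
        int a : 1;
        int   : 0;
        int b : 4;
    };

    printf("%zu\n", sizeof(struct packed)); /* typically 4 */
    printf("%zu\n", sizeof(struct split));  /* typically 8 */
    return 0;
}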
Upvotes: 5
Reputation: 59987
The processor just likes chucking around 32 bits in one go, not 9 or 34. The compiler simply rounds the size up to what the processor likes. (Keep the worker happy.)
Upvotes: 0