Reputation: 1913
This post does not have an answer to my question.
Consider this:
#include <stdio.h>
#include <limits.h>

enum seq {VAL1, VAL2 = 1000000000, VAL3 = UINT_MAX};

int main(void)
{
    printf("%lu\n", sizeof(enum seq));
}
Here UINT_MAX is the maximum value of a uint32_t (about 4 billion). Why does the size of the entire enum type appear to be only 4 bytes? That is only enough to store a single integer value.
Upvotes: 0
Views: 225
Reputation: 144780
In C, there is no way to get the number of enumeration values in an enum, nor the maximum or minimum values. Enumerations are just a handy way to define sets of named constant values, but these values are not stored anywhere. sizeof(enum seq) is the size of the type used by the compiler to represent values of the enumerated type, which is implementation-specific, but must be able to represent all of the enumeration constants. In your example the compiler seems to use uint32_t for this type, as all constants fit in it, hence sizeof(enum seq) evaluates at compile time to 4.
Note however that the C Standard specifies this:
6.7.2.2 Enumeration specifiers
...
Constraints
The expression that defines the value of an enumeration constant shall be an integer constant expression that has a value representable as an int.
Semantics
The identifiers in an enumerator list are declared as constants that have type int...
Each enumerated type shall be compatible with char, a signed integer type, or an unsigned integer type. The choice of type is implementation-defined, but shall be capable of representing the values of all the members of the enumeration. The enumerated type is incomplete until immediately after the } that terminates the list of enumerator declarations, and complete thereafter.
Therefore the C Standard does not allow UINT_MAX as an enumerated value¹, but most compilers extend the semantics to handle larger types for enumerations in a compiler-specific way. If all values fit in type unsigned int but not int, the type of the enumeration could be unsigned int, but the compiler could also use long or even long long.
Also note that you should use %zu for values of type size_t such as the value of sizeof(...), or cast the value to a specific integer type and use the appropriate conversion specification if your library does not support the C99 z conversion modifier.
#include <stdio.h>
#include <limits.h>

enum seq { VAL1, VAL2 = 1000000000, VAL3 = UINT_MAX };

int main(void) {
    printf("%d\n", (int)sizeof(enum seq)); // may print 4
    return 0;
}
1) ignoring the unlikely architectures where INT_MAX == UINT_MAX.
Upvotes: 3
Reputation: 58142
I think maybe I'm starting to understand your question.
In your example program, the numbers 0, 1000000000 and UINT_MAX do not need to be stored in the program's memory at all, since you do not use them. If for example you look at its assembly output, you will not see any of those numbers. That is what the comments mean when they say they are stored "nowhere".
If you did use them, they would very likely be encoded directly into an instruction as an immediate, just as if you had used the integer literals 0 or 1000000000 or 4294967295. See for instance https://godbolt.org/z/6YKeE9. They might also be subjected to constant folding (so you wouldn't encode the number itself, only the result of whatever computation it was used in), or optimized out altogether. But they wouldn't necessarily need to be stored in data memory, unless perhaps you used them to initialize a global or static variable, as here.
And in C, sizeof(type) always gives you the amount of memory used by one object of that type. So even if you had a compiler that did need to store all of the numbers 0, 1000000000 and UINT_MAX in memory somewhere, sizeof(enum seq) would not give you the total amount of memory needed for that; it would only give you the amount of memory needed to store one object of type enum seq. Since a 4-byte unsigned integer is big enough to contain any one of the possible values of enum seq, that's the size you're getting.
Upvotes: 4