Patrick Roocks

Reputation: 3259

enums exceeding the size of the largest number type

I want to fully understand how a C++ compiler deals with an enum that exceeds the range of the largest number type, i.e., one that contains both -1 and UINT64_MAX at the same time:

enum A {
    X = -1,
    Y = UINT64_MAX
};

First I thought that a compiler would not accept this code. Indeed, it does not compile when enum is replaced by enum class, but the above example compiles. According to the standard, the underlying type is determined as follows:

Declares an unscoped enumeration type whose underlying type is not fixed (in this case, the underlying type is an implementation-defined integral type that can represent all enumerator values; this type is not larger than int unless the value of an enumerator cannot fit in an int or unsigned int. If the enumerator-list is empty, the underlying type is as if the enumeration had a single enumerator with value 0). (https://en.cppreference.com/w/cpp/language/enum)

But what does this mean for my example?

I wrote a small sample program to find out what happens:

#include <iostream>
#include <cstdint>

enum A {
    X = -1,
    XX = -1,
    Y = UINT64_MAX
};

int main()
{

    std::cout << "X unsigned: " << (uint64_t)(X) << ", signed: " << (int64_t)(X) << std::endl;
    std::cout << "Y unsigned: " << (uint64_t)(Y) << ", signed: " << (int64_t)(Y) << std::endl;

    std::cout << "(X == XX) == " << (X == XX) << std::endl;
    std::cout << "(X == Y) == " << (X == Y) << std::endl;
}

The output is:

X unsigned: 18446744073709551615, signed: -1
Y unsigned: 18446744073709551615, signed: -1
(X == XX) == 1
(X == Y) == 0

Now I am quite confused. Obviously X and Y represent the same number, but they are still distinguishable, i.e., the comparison X == Y is false (while X == XX is true). What happens here?

I know that the better way is not to use the old enum but the new enum class. Still, plain enum is widely used and I want to understand what happens here.

Upvotes: 12

Views: 1083

Answers (2)

Duck Dodgers

Reputation: 3461

What you are seeing is an effect of the type cast, not of the enum. Your output depends on how you cast the value.

Try this: It has the same output as yours, without the enums.

#include <iostream>
#include <cstdint>

int main()
{

    std::cout << "X unsigned: " << (uint64_t)(-1) << ", signed: " << (int64_t)(-1) << std::endl;
    return 0;
}

And the output is:

X unsigned: 18446744073709551615, signed: -1

Upvotes: -1

Bathsheba

Reputation: 234695

Your compiler is most likely using a 128-bit signed integral type as the underlying type, in accordance with the C++ standard.

See for yourself with

std::cout << sizeof(std::underlying_type<A>::type);

Link: https://ideone.com/z4K0rz, outputs 16.
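
A self-contained version of that check might look like this (a sketch; the size printed is implementation-defined, and it is 16 on the compiler behind the ideone link above):

#include <iostream>
#include <cstdint>
#include <type_traits>

enum A {
    X = -1,
    Y = UINT64_MAX
};

int main()
{
    // Size in bytes of the implementation-defined underlying type of A.
    // A value of 16 means a 128-bit integral type was chosen.
    std::cout << sizeof(std::underlying_type<A>::type) << std::endl;
}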

The output you observe is consistent with a narrowing conversion of this type to a 64-bit unsigned type.
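
To illustrate, here is a minimal sketch, assuming a compiler such as GCC or Clang that offers the non-standard __int128 type as a stand-in for the wide underlying type:

#include <iostream>
#include <cstdint>

int main()
{
    // Stand-in for the 128-bit underlying type deduced for the enum.
    __int128 x = -1;          // corresponds to X
    __int128 y = UINT64_MAX;  // corresponds to Y

    // In the wide type the two values are different ...
    std::cout << "(x == y) == " << (x == y) << std::endl;                    // prints 0

    // ... but narrowing both to 64 bits yields the same bit pattern.
    std::cout << "narrowed:  " << ((uint64_t)x == (uint64_t)y) << std::endl; // prints 1
}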

Upvotes: 10
