TreeWater

Reputation: 867

How to make a #define use 64 bits

Suppose I have the following:

#define MAX (16 * 1024 * 1024 * 1024)
#define MIN (1 * 1024 * 1024 * 1024)

This will give MAX = 0. I assume that this is because the define is only using 32 bits for the define. Is there a way to use 64 bits for this or do I need to rework my code so that the define can handle a smaller value?
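For example, on a platform where int is 32 bits, a minimal test like this prints MAX = 0:

#include <stdio.h>

#define MAX (16 * 1024 * 1024 * 1024)
#define MIN (1 * 1024 * 1024 * 1024)

int main(void)
{
    /* With 32-bit int, the multiplication for MAX overflows
       (signed overflow, so formally undefined behaviour) and the
       compiler typically folds it to 0; MIN still fits and is fine. */
    printf("MAX = %d\n", MAX);
    printf("MIN = %d\n", MIN);
    return 0;
}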

Upvotes: 4

Views: 1165

Answers (4)

Govind Parmar

Reputation: 21552

The macro itself is not sensitive to bitness, but the code that uses it may be. If you want to ensure that the constant MAX is always treated as a long long, you could define it as 17179869184LL or 17179869184i64 (MSVC-only) instead.
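For example, with the standard suffix:

#define MAX 17179869184LL   /* 16 * 1024 * 1024 * 1024, forced to long long */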

Better yet:

const long long int MAX = 17179869184LL;

Upvotes: 3

Useless

Reputation: 67743

This will give MAX = 0

No, this will replace MAX with the literal tokens ( 16 * 1024 * 1024 * 1024 ) during the preprocessing phase.

I assume that this is because the define is only using 32 bits for the define

The define isn't using any bits, it's just a text substitution.
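To see why that matters (a sketch, assuming 32-bit int): even storing the result in a 64-bit variable doesn't help, because the multiplication is carried out in int before any conversion takes place:

#include <stdint.h>
#include <inttypes.h>
#include <stdio.h>

#define MAX (16 * 1024 * 1024 * 1024)

int main(void)
{
    /* The overflow already happened while multiplying ints; converting
       the result to int64_t afterwards cannot recover the intended value. */
    int64_t x = MAX;
    printf("x = %" PRId64 "\n", x);   /* typically prints 0, not 17179869184 */
    return 0;
}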

Is there a way to use 64 bits for this

Using the type explicitly is perhaps nicer than using the integer literal suffix, because it's more explicit about exactly how many bits you get:

#define MAX ((uint64_t)16 * 1024 * 1024 * 1024)

or

#define MAX (16ll * 1024 * 1024 * 1024)
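Casting (or suffixing) only the leftmost operand is enough, because multiplication groups left to right, so each subsequent 1024 is converted to the wider type before it is multiplied. A quick check of the first variant (minimal sketch; the second would use %lld instead):

#include <stdint.h>
#include <inttypes.h>
#include <stdio.h>

#define MAX ((uint64_t)16 * 1024 * 1024 * 1024)

int main(void)
{
    printf("MAX = %" PRIu64 "\n", MAX);   /* 17179869184 */
    return 0;
}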

Upvotes: 6

0___________

Reputation: 67546

I would rather write:

#define MAX (16LL * 1024LL * 1024LL * 1024LL)

Upvotes: 0

Carl Norum

Reputation: 224972

The reason this is happening is that all of those constants are implicitly of type int. In your case, that appears to be a 32-bit type. You need to make sure you're working with a 64-bit type if that's the behaviour you want to have.
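In numbers: 16 * 1024 * 1024 * 1024 is 2^34 = 17179869184, well beyond the 2147483647 that a 32-bit int can hold, and since the low 32 bits of 2^34 are all zero, the wrapped result you observe is exactly 0 (strictly speaking, signed overflow is undefined behaviour, but that is what compilers typically produce here, usually with a warning).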

You can typecast it to make sure it's a 64-bit type:

#define MAX ((int64_t)16 * 1024 * 1024 * 1024)

Or just expand the math yourself:

#define MAX 17179869184
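Note that the unsuffixed form relies on the C99 rule that an unsuffixed decimal constant gets the first of int, long int, long long int that can represent it, so 17179869184 already has a 64-bit-capable type on its own. A quick way to confirm (a sketch):

#include <stdio.h>

#define MAX 17179869184

int main(void)
{
    /* C99: the constant doesn't fit in int, so it is given type long
       (if long is 64 bits) or long long; sizeof typically reports 8. */
    printf("sizeof MAX = %zu\n", sizeof MAX);
    printf("MAX = %lld\n", (long long)MAX);
    return 0;
}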

Upvotes: 3
