potrzebie

Reputation: 1798

"Type" of symbolic constants?

  1. When is it appropriate to include a type conversion in a symbolic constant/macro, like this:

    #define MIN_BUF_SIZE ((size_t) 256)
    

    Is it a good way to make it behave more like a real variable, with type checking?

  2. When is it appropriate to use the L or U (or LL) suffixes:

    #define NBULLETS 8U
    #define SEEK_TO 150L
    

Upvotes: 7

Views: 996

Answers (4)

Carl Norum

Reputation: 225262

You need to do it any time the default type isn't appropriate. That's it.
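For instance (a minimal sketch, not from the answer): an unsuffixed 1 has type int, so shifting it past the width of int is undefined behavior; a ULL suffix gives the constant a wide enough type from the start.

    /* Undefined behavior: 1 is an int, typically 32 bits wide. */
    /* #define BIT40 (1 << 40) */

    /* Fine: 1ULL is unsigned long long, at least 64 bits wide. */
    #define BIT40 (1ULL << 40)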

Upvotes: 4

Jens Gustedt

Reputation: 78993

Typing a constant can be important in places where the automatic conversions are not applied, in particular for functions with a variable argument list:

printf("my size is %zu\n", MIN_BUF_SIZE);

could easily crash if the widths of int and size_t differ and you left out the cast.

But your macro leaves room for improvement. I'd write it as

#define MIN_BUF_SIZE ((size_t)+256U)

(see the little + sign, there?)

Written like that, the macro can still be used in preprocessor expressions (with #if). This works because in the preprocessor the identifier size_t evaluates to 0, so (size_t)+256U is read as (0)+256U, and the result is an unsigned 256 there, too.
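A minimal sketch of the trick in an #if (the threshold and the SMALL_BUFFERS name are illustrative):

    #define MIN_BUF_SIZE ((size_t)+256U)

    /* In the preprocessor, size_t is an ordinary identifier that
       evaluates to 0, so the test below reads as ((0)+256U) < 512U. */
    #if MIN_BUF_SIZE < 512U
    #define SMALL_BUFFERS 1
    #endif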

Upvotes: 4

Luis

Reputation: 1235

Explicitly indicating the type of a constant was more relevant in Kernighan and Ritchie C (before ANSI/Standard C and its function prototypes came along).

Function prototypes like double fabs(double value); now allow the compiler to generate proper type conversions when needed.
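For example (a small sketch of what the prototype buys you):

    #include <math.h>

    /* With the prototype in scope, the int argument is converted to
       double automatically; in pre-ANSI C, without prototypes, it would
       have been passed as an int and fabs would misread it. */
    double d = fabs(-3);  /* -3 is converted to -3.0 before the call */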

You still want to explicitly indicate the constant sizes in some cases. The examples that come to my mind right now are bit masks (see the sketch after this list):

  • #define VALUE_1 ((short) -1) might be 16 bits long while #define VALUE_2 ((char) -1) might be 8. Therefore, given a long x, x & VALUE_1 and x & VALUE_2 would give very different results.

    • This would also be the case for the L or LL suffixes: the constants would use different numbers of bits.
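A minimal sketch of the suffix point (the macro names are illustrative, and the bit widths assume a typical platform with 32-bit unsigned int and 64-bit unsigned long long):

    #include <stdio.h>

    #define MASK_32 0xFFFFFFFFU            /* unsigned int: 32 one-bits */
    #define MASK_64 0xFFFFFFFFFFFFFFFFULL  /* unsigned long long: 64 one-bits */

    int main(void)
    {
        unsigned long long x = 0x1122334455667788ULL;
        printf("%llx\n", x & MASK_32);  /* 55667788: only the low 32 bits survive */
        printf("%llx\n", x & MASK_64);  /* 1122334455667788: every bit survives */
        return 0;
    }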

Upvotes: 0

Omkant

Reputation: 9234

#define is just a preprocessor text-substitution mechanism.

The preprocessor replaces every occurrence of the macro name with its replacement text before compilation.

So either way is correct:

#define A a

int main(void)
{
    int A; // the preprocessor replaces A with a, yielding: int a;
}

There are many variations of #define, such as variadic macros or multi-line macros; a sketch of both follows.
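A minimal sketch of both variations (LOG and SWAP are illustrative names, not from the answer):

    #include <stdio.h>

    /* Variadic macro: forwards a format string plus its arguments. */
    #define LOG(fmt, ...) fprintf(stderr, fmt "\n", __VA_ARGS__)

    /* Multi-line macro: backslashes continue the definition, and
       do/while(0) makes the expansion behave like a single statement. */
    #define SWAP(type, a, b) do { \
            type tmp_ = (a);      \
            (a) = (b);            \
            (b) = tmp_;           \
        } while (0)

    int main(void)
    {
        int x = 1, y = 2;
        SWAP(int, x, y);
        LOG("x=%d y=%d", x, y);  /* prints: x=2 y=1 */
        return 0;
    }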

But the main purpose of #define is the one explained above: plain text substitution.

Upvotes: 0
