Reputation: 1387
What is the point of #define
in C++? I've only seen examples where it's used in place of a "magic number", but I don't see why you wouldn't just give that value to a variable instead.
Upvotes: 83
Views: 152298
Reputation: 91
The true use of #define
is to control the preprocessor stage: to manipulate the source code before the compiler ever runs.
Include guards are the most common example:
#ifndef HEADER_FILE_H
#define HEADER_FILE_H
...
#endif
The code within the guard is included only once: the preprocessor includes it when HEADER_FILE_H
is not yet defined and, in doing so, defines that symbol. Any further attempt to include the file fails the #ifndef
check, and the preprocessor excludes the code. (Incidentally, names like _HEADER_FILE_H, with a leading underscore followed by a capital letter, are reserved for the implementation, so it's best not to use them as guards.)
But another use is to conditionally include code, for example compiling certain code only for a particular compiler or feature (all compilers define symbols that allow them to be detected).
#if !defined __cplusplus
...[code to include when NOT compiling as C++]...
#endif
#if __cplusplus >= 201103L
...[code that's only included if compiling to at least the C++11 standard]...
#endif
#if defined(__clang__)
...[code for Clang compiler]...
#elif defined(__GNUC__) || defined(__GNUG__)
...[code for GCC]...
#elif defined(_MSC_VER)
...[code for MSVC]...
#endif
Another benefit of #define
is that preprocessor symbols can be defined on the command line, so you can add, for example, -DDEBUG
to the compiler arguments, which effectively adds a #define DEBUG
before running the preprocessor.
So you can toggle, for example, debugging code at the command line (or within your makefile
or whatever).
#ifdef DEBUG
#define Log(msg) fprintf(stderr, "%s\n", (msg))
#else
#define Log(msg) ((void)0)
#endif
Using the above code, you can sprinkle Log("entering main loop...");
throughout your code to provide debug logging while it runs.
But it will only make it into the executable if DEBUG
is defined. Otherwise, the #else
branch applies, and every Log call expands to a harmless no-op and vanishes from the build.
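Here is a minimal, self-contained sketch of how that plays out (the file name and message are just illustrative):
// log_demo.cpp
// Build with:  g++ -DDEBUG log_demo.cpp   (logging on)
// or just:     g++ log_demo.cpp           (logging off)
#include <cstdio>

#ifdef DEBUG
#define Log(msg) fprintf(stderr, "%s\n", (msg))
#else
#define Log(msg) ((void)0)  // expands to a no-op; nothing reaches the binary
#endif

int main() {
    Log("entering main loop...");
    // ... real work ...
    return 0;
}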
This is really what #define
is for: it creates a preprocessor symbol you can use to control the preprocessor in various ways and manipulate the source code before it ever reaches the compiler. In that context, it's extremely useful.
But, yes, one trivial way it can manipulate the source code is a straight text substitution, which you can use to give names to 'magic numbers':
#define ARRAY_SIZE 256
int Array[ARRAY_SIZE];
Which works. But that's not really what it's for. And, yes, a const
probably makes more sense for defining your constants.
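For comparison, a sketch of the const version (in C++11 and later, constexpr is the usual spelling; the names here are illustrative):
constexpr int ArraySize = 256;  // typed, scoped, and visible to the debugger
int Array[ArraySize];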
#define
is a preprocessor directive; it's meant for the preprocessor stage, to manipulate the source code before you send it to the compiler.
So I wouldn't say there's no point to it, or compare it directly to const
, but rather point out that it's meant for a different stage of compilation: #define
is perfect for conditional compilation, while const
is probably what you want for defining constants.
It's not really that one is better or worse than the other; they serve different purposes at different stages of compilation. Sure, you can create "magic numbers" at either stage, but if that's all you want to do, you're almost certainly better off using const
for that sort of thing.
Upvotes: 1
Reputation: 423
I got in trouble at work one time. I was accused of using "magic numbers" in array declarations.
Like this:
int Marylyn[256], Ann[1024];
The company policy was to avoid these magic numbers because, it was explained to me, they were not portable and impeded easy maintenance. I argued that when I am reading the code, I want to know exactly how big the array is. I lost the argument, and so, on a Friday afternoon, I replaced the offending "magic numbers" with #defines, like this:
#define TWO_FIFTY_SIX 256
#define TEN_TWENTY_FOUR 1024
int Marylyn[TWO_FIFTY_SIX], Ann[TEN_TWENTY_FOUR];
On the following Monday afternoon I was called in and accused of having passive defiant tendencies.
Upvotes: 39
Reputation: 5949
The #define
lets you establish a value in a header without the header itself compiling to anything of size greater than zero. Your headers should not compile to a size greater than zero.
// File: MyFile.h
// This header will compile to size-zero.
#define TAX_RATE 0.625
// NO: static const double TAX_RATE = 0.625;
// NO: extern const double TAX_RATE; // WHAT IS THE VALUE?
EDIT: As Neil points out in the comment to this post, the explicit definition-with-value in the header would work for C++, but not C.
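A quick sketch of why that difference exists (assuming the header is included from several translation units):
// MyFile.h
// C++: fine, because a namespace-scope const has internal linkage,
//      so each .cpp that includes this gets its own private copy.
// C:   a link error, because a file-scope const has external linkage,
//      so two .c files including this would both define TAX_RATE.
const double TAX_RATE = 0.625;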
Upvotes: 0
Reputation: 3250
The #define
is part of the preprocessor language for C and C++. When a macro is used in code, the preprocessor simply replaces it with whatever you defined, before the compiler ever sees it. For example, if you're sick of writing for (int i=0; i<=10; i++)
all the time, you can do the following:
#define fori10 for (int i=0; i<=10; i++)
// some code...
fori10 {
    // do stuff to i
}
If you want something more generic, you can create preprocessor macros:
#define fori(x) for (int i=0; i<=(x); i++)
// x is replaced by whatever is put into the parentheses, such as
// 20 here (parenthesizing x keeps larger expressions safe)
fori(20) {
    // do more stuff to i
}
It's also very useful for conditional compilation (the other major use for #define
) if you only want certain code used in some particular build:
// compile the following if debugging is turned on and defined
#ifdef DEBUG
// some code
#endif
Most compilers will allow you to define a macro from the command line (e.g. g++ -DDEBUG something.cpp
), but you can also just put a define in your code like so:
#define DEBUG
Upvotes: 163
Reputation: 3495
A #define is evaluated before compilation by the preprocessor, while variables are referenced at run-time. This means you control how your application is built (not how it runs).
Here is an example of a #define that cannot be replaced by a variable:
#define min(i, j) (((i) < (j)) ? (i) : (j))
Note that this substitution is performed by the preprocessor before compilation; the compiler only ever sees the expanded expression.
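A small sketch of what that expansion looks like in practice (the variable names are just for illustration):
#include <cstdio>

#define min(i, j) (((i) < (j)) ? (i) : (j))

int main() {
    int a = 3, b = 7;
    // The preprocessor rewrites the next line to
    // (((a) < (b)) ? (a) : (b)) before the compiler sees it.
    printf("%d\n", min(a, b));  // prints 3
    // Caveat: each argument appears twice in the expansion, so
    // min(a++, b) would increment a twice whenever a < b.
    return 0;
}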
Upvotes: 1
Reputation:
C originally had no const, so #defines were the only way of providing constant values. Both C and C++ have const now, so there is little point in using #define for constants, except when the symbol is going to be tested with #ifdef/#ifndef.
Upvotes: 7
Reputation: 146910
#define
can accomplish some jobs that normal C++ cannot, like guarding headers and other such tasks. However, it definitely should not be used for magic numbers; a static const should be used instead.
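A minimal sketch of that substitution (the names here are illustrative):
// Instead of:  #define BUFFER_SIZE 512
static const int BUFFER_SIZE = 512;  // typed and scoped
char buffer[BUFFER_SIZE];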
Upvotes: 8
Reputation: 182772
Mostly stylistic these days. When C was young, there was no such thing as a const variable. So if you used a variable instead of a #define
, you had no guarantee that somebody somewhere wouldn't change its value, causing havoc throughout your program.
In the old days, FORTRAN passed even constants to subroutines by reference, and it was possible (and headache-inducing) to change the value of a constant like '2' to be something different. One time, this happened in a program I was working on, and the only hint we had that something was wrong was that we'd get an ABEND (abnormal end) when the program hit the STOP 999
that was supposed to end it normally.
Upvotes: 60
Reputation: 32511
The most common use (other than declaring constants) is as an include guard.
Upvotes: 4