Joel

Reputation: 2824

Alternatives to using "#define" in C++? Why is it frowned upon?

I have been developing in C++ for less than a year, but in that time I have heard multiple people talk about how horrible #define is. Now, I realize that it is interpreted by the preprocessor instead of the compiler, and thus cannot be debugged, but is it really that bad?

Here is an example (untested code, but you get the general idea):

#define VERSION "1.2"

#include <string>

class Foo {
  public:
    std::string getVersion() { return std::string("The current version is ") + VERSION; }
};
  1. Why is this code bad?
  2. Is there an alternative to using #define?

Upvotes: 14

Views: 22176

Answers (5)

EdChum

Reputation: 393893

I would not use #define to define a constant. Use the static keyword or, better yet, const variables: const int kMajorVer = 1; const int kMinorVer = 2; or const std::string kVersion = "1.2";

Herb Sutter has an excellent article here detailing why #define is bad, and he lists some examples where there is really no other way to achieve the same thing: http://www.gotw.ca/gotw/032.htm.

Basically, like with many things, it's fine so long as you use it correctly, but it is easy to abuse, and macro errors are particularly cryptic and a bugger to debug.

I personally use them for conditional debug code and for variant data representations, which is detailed at the end of the Sutter article.

Upvotes: 2

Konrad Rudolph

Reputation: 545508

The real problem is that defines are handled by a different tool from the rest of the language (the preprocessor). As a consequence, the compiler doesn’t know about them and cannot help you when something goes wrong – such as the reuse of a preprocessor name.

Consider the case of max which is sometimes implemented as a macro. As a consequence, you cannot use the identifier max anywhere in your code. Anywhere. But the compiler won’t tell you. Instead, your code will go horribly wrong and you have no idea why.

Now, with some care this problem can be minimised (if not completely eliminated). But for most uses of #define there are better alternatives anyway so the cost/benefit calculation becomes skewed: slight disadvantage for no benefit whatsoever. Why use a defective feature when it offers no advantage?

So here is a very simple diagram:

  1. Need a constant? Use a constant (not a define)
  2. Need a function? Use a function (not a define)
  3. Need something that cannot be modelled using a constant or a function? Use a define, but do it properly.

Doing it “properly” is an art in itself but there are a few easy guidelines:

  1. Use a unique name. All capitals, always prefixed by a unique library identifier. max? Out. VERSION? Out. Instead, use MY_COOL_LIBRARY_MAX and MY_COOL_LIBRARY_VERSION. For instance, Boost libraries, big users of macros, always use macros starting with BOOST_<LIBRARY_NAME>_.

  2. Beware of evaluation. In effect, a parameter in a macro is just text that is replaced. As a consequence, #define MY_LIB_MULTIPLY(x) x * x is broken: it could be used as MY_LIB_MULTIPLY(2 + 5), resulting in 2 + 5 * 2 + 5. Not what we wanted. To guard against this, always parenthesise all uses of the arguments (unless you know exactly what you’re doing – spoiler: you probably don’t; even experts get this wrong alarmingly often).

    The correct version of this macro would be:

     #define MY_LIB_MULTIPLY(x) ((x) * (x))
    

But there are still plenty of ways of getting macros horribly wrong, and, to reiterate, the compiler won’t help you here.

Upvotes: 11

Andrew Tomazos

Reputation: 68588

In general the preprocessor is bad because it creates a two-pass compilation process that is unsafe, produces difficult-to-decode error messages and can lead to hard-to-read code. You should avoid it where possible:

const char* VERSION = "1.2";

However there are cases where it is impossible to do what you want to do without the preprocessor:

#define Log(x) cout << #x << " = " << (x) << endl

Upvotes: 1

Kyle

Reputation: 1171

#define isn't inherently bad; it's just easy to abuse. For something like a version string it works fine, although a const char* would be better, but many programmers use it for much more than that. Using #define in place of a typedef, for example, is silly when, in most cases, a typedef would be better. So there's nothing inherently wrong with #define statements, and some things can't be done without them. They have to be evaluated on a case-by-case basis: if you can figure out a way to solve a problem without using the preprocessor, you should do it.

Upvotes: 5

Benjamin Lindley

Reputation: 103693

Why is this code bad?

Because VERSION can be overwritten and the compiler won't tell you.

Is there an alternative to using #define?

const char * VERSION = "1.2";

or

const std::string VERSION = "1.2";

Upvotes: 15
