emenegro

Reputation: 6971

What is the cost of a #define?

To define constants, what is the more common and correct way? What is the cost, in terms of compilation, linking, etc., of defining constants with #define? Is there another, less expensive way?

Upvotes: 5

Views: 969

Answers (7)

JohnMcG

Reputation: 8805

CPU time isn't really the cost of using #define or macros. The cost to you as a developer is as follows:

  • If there is an error in your macro, the compiler will flag it where you referenced the macro, not where you defined it.
  • You will lose type safety and scoping for your macro.
  • Debugging tools will not know the value of the macro.

These things may not burn CPU cycles, but they can burn developer cycles.

For constants, declaring const variables is preferable, and for little type-independent functions, inline functions and templates are preferable.
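For example, here is a small sketch (the names are just illustrative) contrasting a function-style macro with a typed constant and a template:

// Macro: no type checking, and arguments may be evaluated more than once
#define MAX_MACRO(a, b) ((a) > (b) ? (a) : (b))

// Template: type-checked, scoped, and visible to the debugger
template <typename T>
inline T max_value(T a, T b) { return a > b ? a : b; }

// Typed constant instead of #define BUFFER_SIZE 1024
const int BufferSize = 1024;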

Upvotes: 1

Alexey Malistov

Reputation: 26975

The best way to define a constant is to write

const int m = 7;
const float pi = 3.1415926f;
const char x = 'F';

Using #define is bad C++ style. It is impossible to confine a #define to a namespace scope.

Compare

#define pi 3.1415926

with

namespace myscope {
const float pi = 3.1415926f;
}

The second way is clearly better.
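For instance (an illustrative sketch), the namespaced constant is referenced with a qualified name, which a #define cannot provide:

float circumference(float r) {
    return 2.0f * myscope::pi * r;  // qualified access; a #define named pi would ignore the namespace
}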

Upvotes: 14

cHao

Reputation: 86506

The compiler itself never sees a #define. The preprocessor expands all macros before they're passed to the compiler. One of the side effects, though, is that the values are repeated...and two identical strings are not necessarily the exact same string. If you say

#define SOME_STRING "Just an example"

it's perfectly legal for the compiler to add a copy of the string to the output file each time it sees the string. A good compiler will probably eliminate duplicate literals, but that's extra work it has to do. If you use a const instead, the compiler doesn't have to worry about that as much.
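A rough sketch of the const alternative (the name is just illustrative):

// One named constant; every use refers to the same object
const char SOME_STRING[] = "Just an example";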

Upvotes: 5

Mihir Mehta

Reputation: 13833

#define will increase compilation time, but it can be faster in execution...

Generally, #define is used for conditional compilation...

whereas const is used for ordinary numeric computation.

The choice depends on your requirements...
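For example, a minimal sketch (the macro and names are hypothetical):

#define ENABLE_LOGGING 1        // preprocessor switch for conditional compilation

const double TaxRate = 0.07;    // ordinary typed constant used in computation

double totalPrice(double price) {
#if ENABLE_LOGGING
    // diagnostic code is compiled in only when the macro is set
#endif
    return price * (1.0 + TaxRate);
}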

Upvotes: 1

Arun

Reputation: 20383

#define macros are processed by the preprocessor; they are not visible to the compiler. And since they are not visible to the compiler as symbols, it is hard to debug anything that involves a macro.

The preferred way of defining constants is using the const keyword along with proper type information.

const unsigned int ArraySize = 100;

Even better is

static const unsigned int ArraySize = 100;

when the constant is used only in a single file.
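For example (a small illustrative sketch), the typed constant can be used directly as an array bound:

static const unsigned int ArraySize = 100;

int values[ArraySize];   // valid compile-time array bound

void reset() {
    for (unsigned int i = 0; i < ArraySize; ++i)
        values[i] = 0;
}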

Upvotes: 1

vpit3833

Reputation: 7951

#define performs plain text (string) replacement, so mistakes in a macro show up as errors only later, at the point of use. Incorrect types and incorrect expressions are the most common ones.
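A classic illustration of how textual replacement can go wrong at the point of use (hypothetical example):

#define SQUARE(x) x * x

int a = SQUARE(1 + 2);   // expands to 1 + 2 * 1 + 2, which is 5, not 9

const int three = 3;
int b = three * three;   // 9, with no expansion surprises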

For conditional compilation, preprocessor macros work best. For other constants that are used in computation, const works well.

Upvotes: 1

Terry Mahaffey

Reputation: 11981

The cost is only to the preprocessor, when #defines are resolved (ignoring the additional debugging cost of dealing with a project full of #defines for constants, of course).

Upvotes: 2
