Rrjrjtlokrthjji

Reputation: 612

#define vs. variable

I cannot understand the difference between:

#define WIDTH 10 

and

int width = 10;

What are the benefits of using the first or the second?

Upvotes: 11

Views: 29467

Answers (6)

Rafat Khandaker

Reputation: 1

A #define acts like a fixed, file-wide definition. Unlike a normal variable, it cannot be changed or overwritten at run time.

Upvotes: 0

Alok Save

Reputation: 206508

What is the difference between the two?

The first is a macro, while the second is a variable declaration.

#define WIDTH 10 is a preprocessor directive that lets you specify a name (WIDTH) and its replacement text (10). The preprocessor parses the source file, and each occurrence of the name is replaced by its associated text. The compiler never actually sees a macro name at all; what it sees is the replaced text.

The variable declaration is evaluated by the compiler itself. It tells the compiler to declare a variable named width of type int and to initialize it with the value 10.
The compiler knows this variable by its own name, width.

Which one should you prefer? And Why?

Usually, it is recommended to prefer compile-time constant variables over #define. So your variable declaration should be:

const int width = 10;

There are a number of reasons for selecting compile-time constants over #define, namely:

Scope-based mechanism:

A #define is visible from its point of definition to the end of the translation unit, so #defines created in one source file are NOT available in a different source file; within a file, it ignores block and function scope entirely. In short, #defines don't respect scopes. const variables, by contrast, can be scoped: they obey all the usual scoping rules.


Avoiding weird magic numbers in compilation errors:

Macros are replaced by the preprocessor before compilation. So if you receive an error during compilation, the message will be confusing: it won't refer to the macro name but to its value, which appears as a seemingly random number, and one can waste a lot of time tracking it down in the code.


Ease of Debugging:

Also, for the same reasons mentioned in #2, a #define provides no real help while debugging: the debugger only ever sees the substituted value, not the name.

Upvotes: 8

P.P

Reputation: 121347

WIDTH is a macro which will be replaced with the value (10) by the preprocessor, whereas width is a variable.

When you #define a macro (like WIDTH here), the preprocessor will simply do a text-replacement before the program is passed to the compiler. i.e. wherever you used WIDTH in your code, it'll simply be replaced with 10.

But when you do int width = 10;, the variable is alive at run time: it occupies memory, has an address, and its value can be changed.

Upvotes: 2

Samy Vilar

Reputation: 11100

One, #define, is handled by the preprocessor: if it finds WIDTH in the source code, it replaces it with 10; all it does is basic textual substitution, among other things. The other, int width = 10;, is handled by the compiler: it creates an entry in the symbol table, generates code to allocate enough memory (on the stack, depending on where it is defined), and copies the value 10 to that memory location.

So one is nothing more than a label for a constant; the other is a variable at run time.

You can use the preprocessor for slightly faster execution, since a macro needs no storage allocated on the stack, at the cost of not being mutable at run time.

You usually use the preprocessor for things that don't need to change at run time. Be careful, though: because the preprocessor manipulates the source code before it is handed off to the compiler, macros can be tricky to debug and can lead to very subtle bugs that may or may not be apparent from examining the source code.

Upvotes: 0

user520288

Reputation:

First, a short background: before being compiled, a C file is pre-processed. The pre-processor handles #include and #define directives.

In your case, that #define directive tells the pre-processor to replace every occurrence of the token WIDTH in your source code with 10. When the file gets compiled in the next step, every occurrence of WIDTH will in fact be 10. Now, the difference between

#define WIDTH 10 

and

int width = 10;

is that the first one can be seen as a constant value, whereas the second is a normal variable whose value can be changed.

Upvotes: 0

Vlad

Reputation: 35584

Well, there is a great difference. You can change the value of width, take its address, ask for its size, and so on. WIDTH will simply be replaced with the constant 10 everywhere, so the expression ++WIDTH doesn't make any sense. On the other hand, you can declare an array with WIDTH items, whereas you cannot declare an array with width items (outside of C99 variable-length arrays).

Summing it up: the value of WIDTH is known at compile time and cannot be changed; the compiler doesn't allocate memory for WIDTH. width, on the contrary, is a variable with initial value 10; its later values are not known at compile time, and the compiler allocates memory for it.

Upvotes: 14
