CS Student

Reputation: 1633

Symbolic constants in C (#define statement)

After reading through some of K&R's The C Programming Language, I came across #define symbolic constants. I decided to define...

#define INTEGER_EXAMPLE 2
#define CHAR_EXAMPLE 2

...so my question is: how does C know whether I'm defining an int or a char type?

Upvotes: 5

Views: 2573

Answers (7)

JeremyP

Reputation: 86651

It doesn't; this is handled by the preprocessor, which knows nothing about types. The type of the constant depends on the context in which it is used. For instance:

#define INT_EXAMPLE 257

char foo = INT_EXAMPLE;

will attempt to assign 257 to a char, which should generate a warning unless char has more than 8 bits on your computer.
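A minimal sketch of that context dependence, assuming an 8-bit char and reusing the INT_EXAMPLE macro from above:

#include <stdio.h>

#define INT_EXAMPLE 257

int main(void)
{
    int  i = INT_EXAMPLE;  /* fine: 257 fits in an int */
    char c = INT_EXAMPLE;  /* 257 does not fit in an 8-bit char; most compilers warn here */
    printf("%d %d\n", i, (int)c);
    return 0;
}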

Upvotes: 2

johan d

Reputation: 2863

The preprocessor cannot know the type of the macro definition; it just replaces every occurrence of CHAR_EXAMPLE with 2. I would use a cast:

#define CHAR_EXAMPLE ((char)2)
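One way to see what the cast buys you is with sizeof; a small illustration, assuming a typical platform where int is 4 bytes:

#include <stdio.h>

#define CHAR_EXAMPLE ((char)2)
#define INTEGER_EXAMPLE 2

int main(void)
{
    /* the cast gives the expanded expression type char; a bare 2 is an int */
    printf("%zu %zu\n", sizeof CHAR_EXAMPLE, sizeof INTEGER_EXAMPLE);  /* typically prints 1 4 */
    return 0;
}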

Upvotes: 1

nos

Reputation: 229088

Regarding these defines, it doesn't. The expanded macros don't have a type. The preprocessor which processes the #define is just replacing text within the source code.

When you use these defines somewhere, e.g.,

int i = INTEGER_EXAMPLE;

This will expand to

int i = 2;

Here the literal 2 (which in this context is an int) is assigned to an int.

You could also do:

char c = INTEGER_EXAMPLE;

Here too, the literal 2 is an int, and it is assigned to a char. 2 is within the limits of a char though, so all is OK.

You could even do:

int INTEGER_EXAMPLE = 2;

This would expand to

int 2 = 2;

Which isn't valid C.

Upvotes: 3

#define-d names have no types. They just define textual replacements.

What the compiler sees is the preprocessed form. If you are using GCC, try gcc -C -E somesource.c and have a look at the (preprocessed) output.
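For example, given a tiny file like this (somesource.c is just the placeholder name used above), the preprocessed output no longer contains the macro name, only its replacement:

/* somesource.c */
#define INTEGER_EXAMPLE 2

int i = INTEGER_EXAMPLE;  /* gcc -C -E prints: int i = 2; */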

In the 1980s the preprocessor was a separate program.

Read about the cpp preprocessor, and the preprocessor and C preprocessor Wikipedia pages.

You could even define macros whose replacement text is nonsense, like

#define BAD @*?$ some crap $?

Even more scary, you can define things that are syntactically incomplete, like

#define BADTASTE 2 +

and later write BADTASTE 3 in your code.
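For instance, the later use expands like this:

#define BADTASTE 2 +

int x = BADTASTE 3;  /* expands to: int x = 2 + 3;  so x == 5 */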

Actually, you want to use parentheses when defining macros. If you have

#define BADPROD(x,y) x*y

then BADPROD(2+3,4+5) is expanded to 2+3*4+5, which the compiler understands as 2+ (3*4) +5; you really want

#define BETTERPROD(x,y) ((x)*(y))

So that BETTERPROD(2+3,4+5) is expanded to ((2+3)*(4+5))
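A small program makes the difference visible (just a sketch reusing the two macros above):

#include <stdio.h>

#define BADPROD(x,y) x*y
#define BETTERPROD(x,y) ((x)*(y))

int main(void)
{
    printf("%d\n", BADPROD(2+3, 4+5));    /* 2+3*4+5 == 19 */
    printf("%d\n", BETTERPROD(2+3, 4+5)); /* (2+3)*(4+5) == 45 */
    return 0;
}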

Avoid side-effects in macro arguments, e.g. BETTERPROD(j++,j--)
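Even the parenthesized version cannot save you from side effects in the arguments; a short sketch of why that call is dangerous:

#define BETTERPROD(x,y) ((x)*(y))

int main(void)
{
    int j = 5;
    /* expands to ((j++) * (j--)): j is modified twice without a
       sequence point in between, which is undefined behavior */
    int r = BETTERPROD(j++, j--);
    (void)r;
    return 0;
}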

In general, use macros with care and keep them simple.

Upvotes: 10

Sadique

Reputation: 22823

#define does nothing but literal replacement of values. You might want to use

static const

instead, since it respects scope and is type-safe (a sketch is at the end of this answer). To see how literal the replacement is, try this:

#define main no_main

int main()  // the preprocessor replaces main with no_main
{
    return 0;
}

This should give you a linker error, since after replacement there is no main. Or you could try to fool your teacher with this:

#define I_Have_No_Main_Function main //--> Put this in header file 1.h

#include"1.h"

int I_Have_No_Main_Function()
{
    return 0;
}
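And a minimal sketch of the static const alternative mentioned at the top of this answer (the names are only illustrative):

#include <stdio.h>

static const int  integer_example = 2;  /* has a definite type and obeys scope rules */
static const char char_example    = 2;

int main(void)
{
    printf("%d %d\n", integer_example, char_example);
    return 0;
}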

Upvotes: 2

Nathan Fellman

Reputation: 127428

It doesn't. The #define directives are processed before the compiler starts its work. Basically, the preprocessor does a search and replace on what you wrote: for instance, all instances of INTEGER_EXAMPLE are replaced with the text 2.

It is up to the compiler to decide the type of that 2 based on where it's used:

int x = INTEGER_EXAMPLE; // 2 is an integer
char y = INTEGER_EXAMPLE; // 2 is an int, converted to char

Upvotes: 1

RonenKr

Reputation: 213

#define STRING VALUE

is just an instruction for the preprocessor to replace STRING with VALUE; afterwards the compiler takes over and checks the types.
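For instance, if the replacement text does not fit the context it is used in, it is the compiler that complains, not the preprocessor (the macro name and value below are made up):

#define BAD_INT "two"

int x = BAD_INT;  /* after replacement: int x = "two"; the compiler diagnoses the type mismatch */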

Upvotes: 3
