infoclogged

Reputation: 4007

confirm understanding of typedef and #define

There are lots of tutorials and questions addressing this, but I want to confirm my understanding in one specific case. The two declarations below should make no difference to the compiler, i.e. either one is correct. Right?

typedef _GridLayoutInputRepeater<_num-1,Figure,_types...> _base;

and

#define _base _GridLayoutInputRepeater<_num-1,Figure,_types...> 

Similarly, the two below should also make no difference?

#define INT_32 uint32_t

and

typedef uint32_t INT_32;

EDIT: Follow-up thread here

Upvotes: 0

Views: 98

Answers (2)

GCUGreyArea

Reputation: 151

The problem with using #define rather than typedef or using is that [as has been pointed out] #define is a macro, and macros are evaluated and expanded by the preprocessor, so the compiler knows nothing about the data type you're trying to create because the #define directive is simply substituted with whatever comes after it.

The reason for using macros in languages such as C and C++ is to allow for things that aren't specifically to do with source code logic but are to do with source code structure.

The #include directive, for instance, quite literally includes the entire content of a file in place of the directive.

So, if myfile.h contains:

void func_1(int t);
void func_2(int t);

then

#include "myfile.h"

would expand the content of myfile.h, replacing the #include preprocessor directive with

void func_1(int t);
void func_2(int t); 

The compiler then comes along and compiles the expanded file, with class definitions and other macros already expanded in place!

It's why the macro

#pragma once 

or

#ifndef  __MYFILE_INCLUDE__
#define __MYFILE_INCLUDE__ 

is used at the start of header files to prevent multiple definitions occurring.

When you use an expression like #define INT64 unsigned int the preprocessor does exactly the same thing: it replaces every occurrence of INT64 with unsigned int, with no understanding of the types involved.

When you use a typedef, on the other hand, the compiler makes the type substitution, which means the compiler can warn about incorrect use of your newly created type.

With #define, the compiler would instead warn you about incorrect use of unsigned int, which can become confusing if you have a lot of type substitutions!

Upvotes: 1

Hatted Rooster

Reputation: 36503

Without showing use-cases, the two situations are indeed "equal", but what you should note is that #define is a whole different beast from typedef.

typedef introduces an alias for another type; this alias is seen by the compiler and thus follows compiler rules, scoping, etc.

A #define is a preprocessor macro. The preprocessor runs before the actual compiler and performs a literal textual replacement; it does not care about scoping or any syntax rules, so it's quite "dumb".

Usually, typedefs are the way to go, as they are much less error-prone. You could use using = as well; that's personal preference, since the two mean the same thing:

using _base = _GridLayoutInputRepeater<_num-1,Figure,_types...>;

Upvotes: 4
