Reputation: 3
I am using

#define printInt(x) printf("%d", x)

In main() I can use it like this:
int var = 10;
printInt(var);
Which is easier to use than typing
printf ("%d",var);
Will using my own #define for printing an int, float, etc. make my program slower?
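For example, the full family of macros I have in mind would look something like this (printFloat and printStr are just names I made up for the other types):

#include <stdio.h>

/* Convenience macros for printing common types; printFloat and
   printStr follow the same pattern as printInt. */
#define printInt(x)   printf("%d", (x))
#define printFloat(x) printf("%f", (x))
#define printStr(x)   printf("%s", (x))

int main(void)
{
    int var = 10;
    float f = 2.5f;

    printInt(var);   /* expands to printf("%d", (var)); */
    printFloat(f);   /* expands to printf("%f", (f)); */
    return 0;
}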
Upvotes: 0
Views: 463
Reputation: 331
No, it doesn't affect the speed of your program.
The #define directives are processed by the preprocessor before your program is compiled. For example, the call

printInt(var);

is replaced with

printf("%d", var);

by the preprocessor. The compiler therefore can't tell whether a #define was used or not: both versions lead to the same code (and the same program), which is why the two programs can't differ in speed.
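You can see this yourself by running only the preprocessor. With gcc, the -E option stops after preprocessing and prints the result (a minimal sketch; the output formatting varies between compilers):

/* macro_demo.c -- hypothetical file name */
#include <stdio.h>

#define printInt(x) printf("%d", x)

int main(void)
{
    int var = 10;
    printInt(var);  /* the preprocessor rewrites this line */
    return 0;
}

/* Running: gcc -E macro_demo.c
   shows that main() now contains the literal call
       printf("%d", var);
   which is exactly what the compiler proper sees. */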
EDIT: If you use a lot of #defines in your program, the preprocessing step itself may get slower, but in most cases this is not a problem.
Upvotes: 3
Reputation: 51
No, this will not affect the speed. The macro is expanded during preprocessing, so every instance where you write

printInt(myInt)

is passed to the compiler as

printf("%d", myInt);

The binary output should therefore be identical either way.
Upvotes: 5