Reputation: 1
I'm trying to save as much time as I can in a series of calculations that loop around many, many times. There are two cases for the calculation: when a variable `S` equals 1, and when it does not. The variable `S` is set as a constant with a `#define` at the very beginning of the code.

So, right now, I have an `if` that checks the value of `S` before doing the appropriate calculation. Would I get any sort of performance increase if I used `#if` instead, and had the preprocessor pick out the part of the code that will be used ahead of time, since `S` is already available? I don't see any performance increase, but I am being told that this is definitely the way to go.

I still find it weird to use preprocessor directives in the middle of actual code, but if it's actually going to help, then I don't have a problem with it. Is using `#if` in such a case beneficial?
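For concreteness, here is a minimal sketch of the two variants I'm comparing (the actual calculation is just a placeholder):

```c
#include <stdio.h>

#define S 1   /* set once at the very top of the file */

/* Variant 1: runtime check of the compile-time constant */
double compute_if(double x)
{
    if (S == 1)
        return x * 2.0;   /* placeholder for the S == 1 calculation */
    else
        return x * 3.0;   /* placeholder for the S != 1 calculation */
}

/* Variant 2: let the preprocessor pick the branch ahead of time */
double compute_pp(double x)
{
#if S == 1
    return x * 2.0;
#else
    return x * 3.0;
#endif
}

int main(void)
{
    printf("%f %f\n", compute_if(4.0), compute_pp(4.0));
    return 0;
}
```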
Upvotes: 0
Views: 783
Reputation: 44814
If your compiler is any good at all, you will probably see no difference whatsoever. Any decent optimizer will see that you are checking a constant and will not actually produce any code for that `if` check.
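You can verify this yourself by looking at the generated assembly. A minimal sketch (the constant and function names are just stand-ins):

```c
#define S 1

double calc(double x)
{
    /* S is a compile-time constant, so an optimizing compiler will
     * fold this test away and emit only the S == 1 branch. */
    if (S == 1)
        return x + 1.0;
    else
        return x - 1.0;
}
```

Compiling with something like `gcc -O2 -S calc.c` and reading the resulting `calc.s` should show no comparison or branch for that `if` at all.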
I am a bit worried that you have folks throwing (most likely BS) optimization requests at you. That way lies the road to Hell. If you folks think performance in there is so critical that it's worth investing large amounts of extra programmer time into improving, then the Right Way to do it is to profile the routine. Then any alleged optimization that makes the code harder to read should be re-profiled to prove that performance has improved enough to make it worth it.
Also, in C it can be difficult to avoid, but you should really try to avoid the preprocessor in general. Because its mechanism of implementation is so different from normal compiled code, it tends to make your sources way harder to understand. I know that doesn't shave 3 nanoseconds off of your iteration time, but it could save several weeks of maintenance programmer time. That ain't cheap either.
Upvotes: 0
Reputation: 819
The preprocessor will handle the `#if` directive, meaning that the output code will contain only the single necessary line. This will result in a trivially longer compile time, but the best execution time.
Using the `if` keyword, on the other hand, puts a branch instruction in the output machine code, which takes longer to execute than no branch at all.
If you're executing this "if" multiple times, you will save many more cycles overall by having used the #if directive.
As previous answers have mentioned, however, it is very likely the compiler will optimize this for you.
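If you want to see exactly what the preprocessor hands to the compiler, you can inspect its output. A small sketch (file and function names are just examples):

```c
#define S 1

double step(double x)
{
#if S == 1
    return x * 2.0;   /* only this line survives preprocessing when S is 1 */
#else
    return x * 3.0;
#endif
}
```

Running `gcc -E step.c` prints the translation unit after preprocessing; only the `return x * 2.0;` line remains, so there is nothing left for the compiler to branch on.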
Upvotes: -1
Reputation: 61497
Well, it would eliminate the code that is not needed (as the preprocessor just cuts it out and it isn't compiled to binary), but that would only be beneficial if you are really on the edge about either binary size or CPU cycles (i.e. on a microprocessor or embedded device). On a normal desktop application? It wouldn't hurt, but it wouldn't really make any difference either, as it is just one more jump instruction in the code.
Upvotes: 0
Reputation: 40264
If you have a constant expression with no side effects inside an `if`, I would expect a reasonable compiler to generate code that doesn't bother with the calculation. So I would expect no difference. You can check your compiler's assembly output to confirm this.
The only situation I can think of in which it makes sense to use `#if` would be if a certain block of code doesn't make sense at all (or wouldn't compile) for a certain configuration. I would agree that it looks weird for other things. On the other hand, having an `if` statement that is either always true or always false is also a bit weird to me. This is a subjective call, however.
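For example, a sketch of the kind of configuration-dependent case I mean (the macro and the `fast_scale` function are made up for illustration):

```c
#include <stdio.h>

/* Hypothetical configuration flag, e.g. set to 1 only on builds that
 * ship the fast-math support. */
#define HAVE_FAST_MATH 0

double scale(double x)
{
#if HAVE_FAST_MATH
    /* fast_scale() does not exist on builds without that support, so
     * this line would not even compile there; a runtime if could not
     * express this, only #if can remove the line entirely. */
    return fast_scale(x);
#else
    return x * 0.5;
#endif
}

int main(void)
{
    printf("%f\n", scale(3.0));
    return 0;
}
```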
Upvotes: 5