Reputation: 7611
I have a program for some scientific simulation stuff, and as such it needs to run quickly.
When I started out I was somewhat lazy and decided I'd add support for inputting the constants later; for now I just used #define macros for them all.
The problem is that when I tried changing that, it got a lot slower. For example, changing
#define WIDTH 8
//..... code
to
#define WIDTH width
int width;
//... main() {
width=atoi(argv[1]);
//...... code
resulted in something that used to take 2 seconds taking 2.8. That's for just one of about a dozen constants, and I can't really afford even that. Also, there is probably some math with these constants that gets compiled away.
So my question is whether there is some way (a bash script?) to compile the constants I want to use into the program at run time. It's OK if any machine that needs to run this has to have a compiler on it. It currently compiles with a standard (quite simple) Makefile.
This would also allow for -march=native, which should help a little.
I suppose my question is also whether there's a better way of doing this entirely...
Upvotes: 1
Views: 191
Reputation: 90422
The difference is that with the macro being just an integer literal, the compiler is often able to calculate a bunch of the math at compile time. A trivial example is if you had:
int x = WIDTH * 3;
the compiler would actually emit:
int x = 24;
no multiply there. If you change WIDTH to a variable, it can't do that, because it could be any value. So there is almost certainly going to be some difference in speed (how much depends on the circumstances, and it is often so little that it doesn't matter).
I recommend making variables out of the things that genuinely need to be variable, and then profiling to find the hot spots in the code. Almost always, it's the algorithm that slows you down the most. Once you find out which blocks of code you are spending the most time in, you can figure out ways to make those parts faster.
The only real solution would be to have a separate header file with the constants, which a script could generate before compiling the program. Or, if there aren't too many, just pass them directly to gcc. This of course sacrifices up-front speed for runtime speed. I do wonder, though: if a difference of 0.8 seconds at runtime is unaffordable, how is compiling the program (which will surely take more than a second) affordable?
The script could be something as simple as this:
#!/bin/sh
echo "#define WIDTH $1" > constants.h
echo "#define HEIGHT $2" >> constants.h
gcc prog.c -o prog && ./prog
where prog.c includes constants.h. Or something like this, with no extra header:
#!/bin/sh
gcc -DWIDTH="$1" -DHEIGHT="$2" prog.c -o prog && ./prog
Upvotes: 2
Reputation: 4378
You could store the relevant defines into a separate header file constants.h:
#ifndef CONSTANTS_H
#define CONSTANTS_H
#define WIDTH 8
...other defines...
#endif
If you take care that the header is included only once, then you can even omit the include guards and have a small file with only the relevant stuff. I would go this way if the program is used by others who need to change the constants. If you're the only one using it, then Jerry's method is just fine.
EDIT:
Reading your comment: this separate header could easily be generated by a little tool invoked from the makefile before compilation.
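That generator could be a rule in the makefile itself. A minimal sketch, with file names and variable names assumed rather than taken from the question:

```makefile
# Sketch: regenerate constants.h from make variables, e.g.
#   make WIDTH=12 HEIGHT=32 prog
# Caveat: constants.h is only rebuilt when the Makefile changes;
# delete it (or add a FORCE prerequisite) after changing values
# on the command line.
WIDTH  ?= 8
HEIGHT ?= 32

constants.h: Makefile
	printf '#define WIDTH %s\n#define HEIGHT %s\n' $(WIDTH) $(HEIGHT) > $@

prog: prog.c constants.h
	gcc -O2 prog.c -o prog
```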
Upvotes: 1
Reputation: 490108
At least if I understand your question correctly, what I'd probably do would be something like:
#ifndef WIDTH
#define WIDTH 8
#endif
(and likewise for the other constants you want to be able to modify). Then in your makefile(s), add options to pass the correct definitions to the compiler when/if necessary, so if you wanted to change WIDTH, you'd have something like:
cflags=-DWIDTH=12
and when you compile the file, this would be used as the definition for WIDTH, but if you didn't define a value in the makefile, the default in the source file would be used.
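For illustration, a minimal makefile along these lines (target and file names are assumed, not from the question):

```makefile
# With -DWIDTH on the command line, the compiler's definition wins;
# without it, the #ifndef default in the source is used:
#   make prog                      -> WIDTH = 8 (source default)
#   make CFLAGS="-O2 -DWIDTH=12"   -> WIDTH = 12
CFLAGS = -O2 -DWIDTH=12

prog: prog.c
	gcc $(CFLAGS) prog.c -o prog
```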
Upvotes: 4