Reputation: 40068
For large C++ projects, is it better to compile all .cpp files in one big invocation of g++, or to compile each one separately and then link the results, or something in between (such as compiling all files of a namespace/module/subdirectory at once)?
What is the actual difference? Which method is fastest, and why?
Upvotes: 0
Views: 162
Reputation: 14467
There is a trick. Say you have a bunch of files:
file1.cpp
file2.cpp
...
fileN.cpp
Then you can create the "master.cpp" file:
#include "file1.cpp"
#include "file2.cpp"
...
#include "fileN.cpp"
and compile it.
This way you can easily compare the build time of compiling the files individually against compiling them all as one bunch.
If you're on UNIX, use the time command to measure the execution time of the gcc call.
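For example, a sketch assuming the file names above (the -c flag stops before linking, so only compilation time is measured):

time g++ -c file1.cpp file2.cpp ... fileN.cpp
time g++ -c master.cpp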
And, of course, the bottleneck is the "gather" operation, i.e. the linking stage. There's the gold linker (see Wikipedia), which was written specifically to make linking ELF files much faster.
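A minimal sketch of how to request it (assuming gold is installed and your GCC supports the -fuse-ld=gold flag; the output name app is made up for illustration):

g++ -fuse-ld=gold file1.o file2.o ... fileN.o -o app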
Upvotes: 1
Reputation: 11256
Compiling files separately and then linking is the better practice, because it allows you to recompile only the files affected by a modification. Hence the build time is minimized after the first full build.
This is, in fact, how makefiles are usually written, for exactly that reason; a minimal sketch follows below.
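A minimal sketch of such a makefile (the target name app is made up for illustration, header dependencies are not tracked here, and recipe lines must be indented with a tab):

# Link the objects; relink only when some .o file changed.
app: file1.o file2.o fileN.o
	g++ -o app file1.o file2.o fileN.o

# Compile each translation unit; recompile only when its .cpp changed.
%.o: %.cpp
	g++ -c $< -o $@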
Upvotes: 4
Reputation: 19286
Obviously, launching multiple compiler processes in parallel leads to better CPU utilization, provided you're on a multi-core or multi-CPU system. The main bottleneck for the whole project, however, is still the linker, which has to build the executable out of all the object files.
Other than that, it really depends on the project itself: how many dependencies are there between the files, and which ones need to be compiled first? You're best off letting your build system make those decisions for you.
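For example, with a make-based build the parallelism is just a flag (the job count 4 is an illustrative choice; pick roughly the number of CPU cores):

make -j4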
Upvotes: 0