Reputation: 22854
Could someone please describe what happens during the linking stage of lib_xxxx libraries under *nix-like systems when using the ./configure and make commands?
Actually, I have a concrete question:
What happens behind the linking stage in makefiles, compared to simply archiving the required .o object files?
So, I'm pretty sure that, for example, any C library can be combined using a command like ar r libXYZ.a *.o run from the object folder. But I suspect that this is not what is actually done (because the makefiles use libtool, etc.). What's the purpose of that, and how is it done?
What's the purpose of that and how is it done?
What about the linking stage when compiling C++ code libraries (in a simple case -- for example, when no cross-object optimization is required)? I suspect you can also put the resulting object files into an archive and use it as a static library.
So what's actually hidden and what's the purpose of that?
Thank you.
Upvotes: 1
Views: 861
Reputation: 16315
So, I'm pretty sure that, for example, any C library can be combined together using command like: ar r libXYZ.a *.o from the object folder.
But I suspect, that this is not what is actually done (because the makefiles use libtool, etc, ...). What's the purpose of that and how is it done?
The purpose of libtool is to hide the complexity of invoking the system's actual library tools (e.g., ar, nm, ld, lipo) so that the package is more portable across POSIX-like OSes. The ./configure script outputs the libtool script during its execution. The package maintainer doesn't have to use libtool and can instead invoke the native tools manually with the correct flags, but libtool makes the job easier.
As you suspected, the command-line tools are used to build libraries, but they are hidden behind libtool. You can find out more about how this is accomplished by examining one of the .la library files that libtool outputs, but that is only part of the answer. The rest depends on the platform where the libraries are to be used, since libraries can be cross-compiled.
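For instance, a .la file is just a small shell-style text file that records where the real libraries live and what they are called. A shortened, purely illustrative libhello.la (the library name and paths here are invented) might look like:

```
# libhello.la - a libtool library file (illustrative excerpt)
dlname='libhello.so.0'
library_names='libhello.so.0.0.0 libhello.so.0 libhello.so'
old_library='libhello.a'
dependency_libs=''
libdir='/usr/local/lib'
```

The old_library entry points at the ar-style static archive, while library_names lists the shared-library variants, which is how libtool keeps track of both forms at once.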
What happens behind the linking stage in makefiles compared to simply archiving the required .o object files?
Again, the answer depends on the platform where the libraries are to be used. For static libraries, archiving the object files with ar is usually enough. For dynamic libraries, the answer varies by system. The libtool script has this information built into it, so the Right Thing happens when make install is invoked.
Upvotes: 0
Reputation:
ar only gets used when creating static libraries, which are basically just a blob of object files (and, as such, are only usable as input to ld -- they can't be loaded at runtime). Creating a dynamic library is a completely separate process in which ld has to get involved.
On most UNIXy systems besides OS X, libtool is just a front-end to the compiler and linker. Most of what goes on behind the scenes is just (g)cc and ld.
(On OS X, libtool is a separate utility which is used to create both .a static libraries and .dylib shared library files.)
Upvotes: 0
Reputation: 477630
There are many different steps in what you just said:
configure is a shell script that sets up the build environment.
make is a tool that invokes various instances of the compiler (and many other things) depending on file dependencies (i.e., whether dependent files are newer than their targets).
The actual "compiler" invocation is perhaps what you're most interested in: preprocessing, compiling, assembling -- you seem to be happy with those.
Finally, linking: all the object files have to be turned into an executable. The linker will look for one entry point (typically main()). Every symbol (i.e., function or global variable) that appears has to be filled in with the address of actual code supplied by some object file. Once all the symbols from your local object files have been used, you may still have "undefined symbols", which need to be satisfied from libraries.
So the result of a complete linking typically is a binary where all the names of functions (may) have been removed and replaced by the actual addresses (never mind PIC) of code or references to load-time shared libraries. Essentially, your initial collection of objects is no longer visible in the linked binary. (Modern link-time optimization may in fact mix up and prune your code very severely.) On the other hand, a static library created with ar is just a collection of raw, unlinked objects with all their symbols intact, which may be used for linking.
Hm, this was a very hectic overview of a vast subject, so naturally this is nowhere near complete or representative, and probably only partially correct. Post a comment if you have a specific concern.
Upvotes: 0
Reputation: 283911
Not all ar utilities are created equal. libtool hides the different options, etc., required by different toolchains, and also supports creation of both shared and static libraries.
Upvotes: 3