xyz

Reputation: 8917

Some basic questions about debugging core files in C++/Linux

Debugging core files of C++ applications on Linux has always been a mystery to me, and I'm lacking some basic understanding.

(1) Do we necessarily need to compile applications with the -g flag, without which core files are unable to give any useful information whatsoever? But I see that core files are generated even when we don't compile with -g -- so they must be serving some purpose apart from occupying space on disk.

Wikipedia says: "In computing, a core dump, memory dump, or storage dump consists of the recorded state of the working memory of a computer program at a specific time, generally when the program has terminated abnormally (crashed)".

This should mean that irrespective of whether we compiled with the -g flag, we still have that state, and if we have a stack trace, we should still be able to tell which function caused the error.
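
For context, here is roughly what I am trying (a minimal sketch; ./myapp is a placeholder, and the core file name depends on /proc/sys/kernel/core_pattern):

    ulimit -c unlimited    # enable core dumps in this shell (often disabled by default)
    ./myapp                # the program crashes, e.g. with a segmentation fault, and a core file is written
    gdb ./myapp core       # load the dump together with the executable
    (gdb) bt               # print a backtrace from the dump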

Upvotes: 0

Views: 349

Answers (2)

Ribtoks

Reputation: 6922

so they must be serving some purpose apart from occupying space on disk

You can limit the size of core files with the ulimit -c $limit command, so your core files won't occupy your disk space.
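
For example (a rough sketch; the limit value here is arbitrary):

    ulimit -c 0          # disable core dumps entirely in this shell
    ulimit -c 1024       # or cap the core file size (the unit is shell-dependent blocks)
    ulimit -c unlimited  # or remove the limit
    ulimit -c            # show the current limit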

And, as Joachim already said, the -g option just includes debug symbols in your program.

Upvotes: 1

Some programmer dude

Reputation: 409206

The -g option has nothing to do with the core files; it puts debug information into the program. That is, the generated executable file will contain all symbols (e.g. function and variable names) as well as line-number information (so you can find out on which line a crash occurred).

The actual core dump only contains a memory dump. Yes, together with the program, you can get a stack trace, but unless the program has debug information you cannot see function names or line numbers, only their addresses.
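
As a rough illustration (assuming gcc and gdb are available, a small crashing source file hypothetically called crash.cpp, and a core file named core; the exact name depends on /proc/sys/kernel/core_pattern):

    g++ -g -O0 crash.cpp -o crash_debug   # build with debug information
    g++ crash.cpp -o crash_plain          # build without it

    ulimit -c unlimited
    ./crash_debug                          # crashes and writes a core file
    gdb ./crash_debug core
    (gdb) bt                               # backtrace with function names and file:line info

    ./crash_plain                          # crashes and writes another core file
    gdb ./crash_plain core
    (gdb) bt                               # backtrace largely reduced to raw addresses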

Upvotes: 2
