Viktor Malyi

Reputation: 2386

How does GDB deal with big (>1 GB) debug files?

I have a problem debugging a C++ application over a remote GDB session: the codebase is large, so when it is compiled with the -O2, -g, -DNDEBUG flags it produces a big file with debug information (1.1 GB).
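For reference, the build looks roughly like this (file names are placeholders, and splitting the debug info out with objcopy is just one way such a separate debug file can be produced):

g++ -O2 -g -DNDEBUG -c module.cpp -o module.o
g++ -O2 -g -DNDEBUG module.o -o my-application
objcopy --only-keep-debug my-application my-application.dbg   # extract debug info into a separate file
objcopy --strip-debug my-application                          # keep the shipped binary small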

Unfortunately, I can't just rely on partial symbol tables during debugging, because the debugger keeps skipping parts of the application, and I'm unable to set breakpoints there or see the source code while debugging.

As a workaround, I execute the following command after I have connected to the target:

symbol-file -readnow [path-to-file-with-debugging-info]

This expands the full symbol tables, but then GDB simply runs out of memory, consuming 13 GB of RAM or even more (while I have only 16 GB available on my machine). This problem is known and already listed on the GDB wiki.
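For context, the whole remote flow looks roughly like this (the address and file names are placeholders):

gdb my-application
(gdb) target remote 192.168.1.100:2345
(gdb) symbol-file -readnow my-application.dbg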

My question is: how do I deal with GDB in this case, when I need the full symbol tables but GDB requires an enormous amount of memory to expand them?

Thanks in advance!

Upvotes: 6

Views: 1480

Answers (2)

Viktor Malyi

Reputation: 2386

Since dealing with big debug files is a weakness of GDB, the best approach in this case was to reduce the size of the *.dbg file by keeping debug symbols not for all application modules, but only for those where debugging will actually take place.
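A rough sketch of what this looks like in practice (the module names here are made up): compile only the modules where breakpoints are actually needed with -g, build the rest without debug info, then extract the symbols into the *.dbg file:

g++ -O2 -DNDEBUG -c network.cpp -o network.o      # no debug info
g++ -O2 -g -DNDEBUG -c parser.cpp -o parser.o     # module that will be debugged
g++ network.o parser.o -o app
objcopy --only-keep-debug app app.dbg             # much smaller *.dbg file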

With a ~150 MB *.dbg file and the DS-5 debugger, I needed only 2.5 GB of RAM this way, which is acceptable.

Upvotes: 0

ks1322

Reputation: 35716

You can try using the gold linker with the --compress-debug-sections=zlib option. This will reduce the size of the debug info. GDB can read compressed debug info since version 7.0.
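For example, with GCC the option can be passed to gold through the compiler driver, something like this (object and output names are placeholders):

g++ -g -O2 -fuse-ld=gold -Wl,--compress-debug-sections=zlib main.o util.o -o app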

Upvotes: 1
