clemej

Reputation: 2563

Space efficient C++ vector allocator for large vectors?

I'm working with some C++ code that implements a graph algorithm which uses lots of small chunks of memory (it's a relative of gSpan, but that doesn't matter here). The code uses std::vector to store many small elements (on the order of 64 bytes each). However, I'm running it on much larger data sets than the original authors were, and I'm running out of memory.

It appears, however, that I'm running out of memory prematurely. Fragmentation? I suspect the culprit is how std::vector grows: whenever it needs more room it must allocate a new, larger contiguous block and copy the old contents over, so the old and new buffers briefly coexist. I have 8GB of RAM and 18GB of swap, yet when std::bad_alloc is thrown I'm only using 6.5GB resident and ~8GB virtual. I've caught the bad_alloc, printed the offending vector's size, capacity, and max_size, and here's what I see:

size: 536870912
capacity: 536870912
maxsize: 1152921504606846975
terminate called after throwing an instance of 'std::bad_alloc'
    what():  std::bad_alloc
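
(For reference, the diagnostic I added is roughly the following; the function name and the element type are illustrative stand-ins, not the real code.)

    #include <cstdint>
    #include <iostream>
    #include <new>
    #include <vector>

    // Illustrative only: wrap the failing push_back and dump the vector's
    // bookkeeping before re-throwing; this is where the numbers above come from.
    void append_checked(std::vector<std::uint64_t>& v, std::uint64_t value) {
        try {
            v.push_back(value);
        } catch (const std::bad_alloc&) {
            std::cerr << "size: " << v.size() << '\n'
                      << "capacity: " << v.capacity() << '\n'
                      << "maxsize: " << v.max_size() << '\n';
            throw;  // let the original failure propagate
        }
    }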

So, clearly, it isn't max_size() that we've hit: the vector is simply full (size == capacity), and the library's attempt to allocate the next, larger contiguous buffer is what fails.
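
To illustrate why size == capacity is the trigger (this is a standalone sketch, not taken from the real code): a vector grows geometrically, and each growth step allocates a brand-new contiguous block and copies into it while the old block is still live, so the peak demand at that moment is roughly old size plus new size.

    #include <cstddef>
    #include <iostream>
    #include <vector>

    int main() {
        std::vector<int> v;
        std::size_t last_cap = 0;
        for (int i = 0; i < 1000000; ++i) {
            v.push_back(i);
            if (v.capacity() != last_cap) {
                // Typical implementations grow capacity by 1.5x-2x; every
                // step here is a fresh contiguous allocation plus a copy.
                std::cout << "grew to " << v.capacity() << " elements\n";
                last_cap = v.capacity();
            }
        }
    }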

So my question is essentially the title: is there a more space-efficient allocator, or a drop-in replacement for std::vector, that would let these large vectors grow without the big contiguous reallocations?

Since I don't know how much memory will ultimately be used, I'm aware that even if I make changes there still might not be enough memory for my calculations. But I suspect I can get at least a lot further than I'm getting now, which seems to be giving up very quickly.

Upvotes: 2

Views: 1480

Answers (1)

Mark B

Reputation: 96233

I would try using std::deque as a direct drop-in for vector. Because it (typically) stores its elements in a collection of fixed-size chunks rather than one contiguous block, extending a deque just allocates another chunk; it never has to reallocate and copy the whole sequence, so growing it needs far less spare memory than growing a vector.
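
A minimal sketch of what that swap might look like (Element and edges are placeholder names, not the asker's actual types):

    #include <cstdint>
    #include <deque>

    // Placeholder ~64-byte record standing in for the real element type.
    struct Element {
        std::uint64_t fields[8];
    };

    // Was: std::vector<Element> edges;  -- one huge contiguous buffer that is
    // reallocated and copied whenever it outgrows its capacity.
    // With a deque, growth just appends another fixed-size chunk, so there is
    // no single giant allocation and no old-plus-new copy spike.
    std::deque<Element> edges;

One caveat: std::deque has no data(), no reserve(), and no contiguity guarantee, so any code that hands the buffer to a C API or does pointer arithmetic across elements would need adjusting.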

Upvotes: 6
