liwen zeng

Reputation: 89

How do I find the real maximum size of a vector? max_size() doesn't work

When pushing many elements into a std::vector, I get an "out of memory" error. To avoid it, I check vector::max_size() first, then reserve() or push_back() within that limit. If max_size() is larger than the reserved value it should be fine, but it is not. So what does max_size() actually mean?

I compiled a demo on Windows 7 with Visual Studio 2010; my PC has 4 GB of RAM. Reserving 1/2 of max_size() fails:

max_size() = 2^32 / sizeof(CPoint3D) - 1 = 268435455

Reserving 1/4 of max_size() works in the demo; in my real project, it only works up to about 1/10.

What is the real maximum size of a vector, and how can I enlarge it?



Upvotes: 2

Views: 1895

Answers (4)

amit kumar

Reputation: 21012

The problem is that vector tries to allocate a single contiguous block of memory, which may not be available at that moment even though the total free memory is much larger.

I would suggest using std::deque, as it does not require a contiguous block of memory.

Upvotes: 3

stinky472

Reputation: 6797

max_size() returns the maximum number of elements a vector can possibly hold: the absolute limit when taking into account things like the range of the size type it uses and the address-space limits of the operating system.

This doesn't mean you can actually make a vector hold that many elements; it just means you can never store more. Also, just because you have 4 gigs of RAM doesn't mean you can create a single contiguous buffer occupying 4 gigs, or anywhere close. There are other factors to consider, like memory fragmentation: it may leave you unable to obtain even a one-gig contiguous block despite having more free memory overall.

If you really need that many elements in a container, a contiguous sequence is probably not a good choice. For data sets that large, you need something that can be allocated in bits and pieces, like std::deque.

Upvotes: 6

rwong

Reputation: 6162

vector::capacity() gives the number of elements the vector can hold without a re-allocation, and it is that re-allocation which can fail with std::bad_alloc.

vector::max_size() has a different meaning: it is a theoretical ceiling, roughly (INT_MAX / sizeof(element)).

For more information on Windows memory management, see the MSDN article.

Upvotes: 3

Jerry Coffin

Reputation: 490018

max_size() tells you the design limit of the class, but a memory shortage can limit the real size to something smaller. There is generally no way to find out what that lower limit is, though; for example, it can change from one moment to the next depending on how much memory other programs are using.

Upvotes: 4
