Reputation: 8004
The C++14 final working draft makes the following comment about std::vector:
Storage management is handled automatically, though hints can be given to improve efficiency.
cppreference says:
The storage of the vector is handled automatically, being expanded and contracted as needed.
And the Wikipedia entry for Dynamic array says:
C++'s std::vector and Rust's std::vec::Vec are implementations of dynamic arrays
So I think that a vector should automatically reduce its capacity when its capacity is much larger than its size. I wrote the following code to check my assumption:
#include <iostream>
#include <vector>
using namespace std;

int main() {
    vector<int> v = {};
    cout << "initialization" << endl;
    cout << " capacity: " << v.capacity() << endl;
    cout << " size: " << v.size() << endl;
    for (int i = 1; i <= 10000; i++)
        v.push_back(i);
    cout << "after inserting a lot of elements" << endl;
    cout << " capacity: " << v.capacity() << endl;
    cout << " size: " << v.size() << endl;
    v.erase(v.begin() + 1, v.begin() + 10000);
    cout << "after erasing a lot of elements" << endl;
    cout << " capacity: " << v.capacity() << endl;
    cout << " size: " << v.size() << endl;
    v.push_back(9);
    cout << "after inserting another element" << endl;
    cout << " capacity: " << v.capacity() << endl;
    cout << " size: " << v.size() << endl;
}
I used g++ -std=c++14 code.cc to compile the code, and running the resulting a.out produces the following output (I am using macOS Mojave):
initialization
 capacity: 0
 size: 0
after inserting a lot of elements
 capacity: 16384
 size: 10000
after erasing a lot of elements
 capacity: 16384
 size: 1
after inserting another element
 capacity: 16384
 size: 2
So it seems that a std::vector does not reduce its capacity even if its capacity is much larger than its size.
Will a std::vector ever reduce its capacity? Are there conditions that trigger a reduction of its capacity?
Upvotes: 6
Views: 1236
Reputation: 63162
So I think that a vector should reduce its capacity when its capacity is much larger than its size.
Firstly, the standard would have to specify what "capacity is much larger than its size" would mean. That would limit the choices that implementations currently have over the reallocation strategy.
Secondly, reducing the capacity would require re-allocating the storage and moving all the remaining elements. This means that all iterators could be invalidated by an erase, which would limit safe usage.
Currently, erase states:
Invalidates iterators and references at or after the point of the erase, including the end() iterator.
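To illustrate the point, a small sketch (the values are arbitrary): an iterator obtained before the erase point stays valid under the current rules, but would dangle if erase were allowed to reallocate.

#include <iostream>
#include <vector>

int main() {
    std::vector<int> v = {1, 2, 3, 4, 5};
    auto it = v.begin();             // obtained before the erase point
    v.erase(v.begin() + 3, v.end()); // erases the tail only
    std::cout << *it << '\n';        // OK: 'it' is still valid, prints 1
    // If erase were allowed to shrink the capacity (i.e. reallocate),
    // 'it' would be dangling here and dereferencing it would be
    // undefined behavior.
}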
Thirdly, a vector is just as likely to reach its high watermark of capacity again as it is to remain small for a long time.
You would be making the usage worse for a number of valid scenarios, for the dubious benefit of releasing a large allocation. Modern virtual memory systems deal fine with old allocations sticking around longer than strictly necessary.
Are there conditions that trigger a reduction of its capacity?
Yes, shrink_to_fit is an explicit request to do exactly that. If you do desire the vector to reallocate to a smaller size, you can ask for it. Other usages, which would be harmed by shrinks, are unaffected.
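For example, adding a shrink_to_fit() call to code like that in the question (a sketch; the resulting capacity is implementation-specific, since the request is non-binding):

#include <iostream>
#include <vector>

int main() {
    std::vector<int> v;
    for (int i = 1; i <= 10000; i++)
        v.push_back(i);
    v.erase(v.begin() + 1, v.begin() + 10000);
    std::cout << "capacity before: " << v.capacity() << '\n';
    v.shrink_to_fit(); // explicit, non-binding request to reduce capacity()
    std::cout << "capacity after:  " << v.capacity() << '\n'; // typically 1
}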
Upvotes: 6
Reputation: 24788
If std::vector::shrink_to_fit() in your implementation doesn't do what you want to achieve (since it just makes a non-binding request for reducing the vector's capacity), you could consider applying the swap-to-fit idiom on your vector v:
std::vector<int> v;
// ...
{
    std::vector<int> tmp(v); // create a copy of vector v
    tmp.swap(v);             // exchange the copy's contents with v's
    // vector tmp goes out of scope --> tmp is destroyed
}
Or simply as a one-liner:
std::vector<int>(v).swap(v);
This idiom consists of creating a temporary vector object, which is made of the elements of the original vector v, and then swapping this resulting temporary with the original vector v.
Applying this idiom exchanges the capacities of the resulting temporary vector and v.
This will result in v having a lower capacity if the resulting temporary has a lower capacity than the original vector v. This is likely to be the case if v.size() was much lower than v.capacity().
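Putting it all together, a minimal demonstration of the idiom (the exact capacity values printed are implementation-specific):

#include <iostream>
#include <vector>

int main() {
    std::vector<int> v(10000, 42);
    v.erase(v.begin() + 1, v.end()); // size is now 1, capacity unchanged
    std::cout << "before: capacity " << v.capacity()
              << ", size " << v.size() << '\n';
    std::vector<int>(v).swap(v);     // swap-to-fit one-liner
    std::cout << "after:  capacity " << v.capacity()
              << ", size " << v.size() << '\n'; // capacity typically 1 now
}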
Upvotes: 4
Reputation: 26194
It depends on which operations you consider. If you include all of them, then yes.
See [vector.capacity]p9 (shrink_to_fit):
Remarks: shrink_to_fit is a non-binding request to reduce capacity() to size(). [ Note: The request is non-binding to allow latitude for implementation-specific optimizations. — end note ] ...
and [vector.capacity]p10 (swap):
Effects: Exchanges the contents and capacity() of *this with that of x.
In other words, with shrink_to_fit(), you may reduce the capacity. With swap(), as long as you are able to create another vector with a smaller capacity(), you are guaranteed to reduce your capacity() if you swap() them (but note that the "original" capacity, i.e. the one which now belongs to the other object, is not reduced per se).
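As a sketch of that guarantee (the capacity of the freshly constructed vector is implementation-specific, but the exchange performed by swap() is guaranteed):

#include <iostream>
#include <vector>

int main() {
    std::vector<int> big(1 << 20); // large size, hence large capacity
    big.resize(10);                // small size, capacity unchanged

    std::vector<int> small(big.begin(), big.end()); // typically capacity() == 10
    small.swap(big); // the capacities are exchanged, per [vector.capacity]p10

    std::cout << "big.capacity():   " << big.capacity() << '\n';   // now small
    std::cout << "small.capacity(): " << small.capacity() << '\n'; // now large
}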
Quick note regarding:
So I think that a vector should reduce its capacity when its capacity is much larger than its size. I wrote the following code to check my assumption:
In order to reduce the capacity(), implementations would typically need to allocate new memory, move/copy the elements, and deallocate the original storage, which is a very expensive operation that nobody wants when erasing elements.
On top of that, if your program then reaches its new capacity() again (which is likely, because it already did so before), the whole process has to be repeated.
Upvotes: 1