Reputation: 591
I'm developing a program which needs to be memory efficient. I have used std::vector
in my program to store a large number of elements. But I noticed that the program's memory usage grows exponentially when the number of elements is chosen large.
For example I wrote the following code:
#include <iostream>
#include <vector>
using namespace std;
int main(){
int vecSize;
cin >> vecSize;
vector<double> a(vecSize);
return 0;
}
Then I monitored the memory consumption using the gnu time command like this:
/usr/bin/time -f "Mem: %M" ./a.out
These are the memory results I got for different vector sizes:
VecSize MemUsage
10: 4720 KB
100: 4720 KB
1000: 4736 KB
10000: 5024 KB
100000: 7744 KB
1000000: 35872 KB
10000000: 317120 KB
Does anybody know why the memory usage grows so fast when the number of elements goes above 100000?
Upvotes: 2
Views: 707
Reputation: 2741
MSalters was right, and I can't read!
The answer is very simple. To me, the growth seems linear.
You picked VecSize values that increase exponentially, so you should expect MemUsage to increase exponentially across those data points as well! (in the large-n limit -- small-size optimizations likely exist, as evidenced by the near-constant usage for small n...)
I did a linear regression on the data out of curiosity, and the correlation coefficient was 1.0-- indicating that the relation between VecSize and MemUsage (as posted) is (most likely... don't kill me statisticians) linear.
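Out of curiosity, here's a rough sketch of that check in plain C++ (the VecSize/MemUsage pairs are copied straight from the question; it just computes a Pearson correlation coefficient, nothing fancy):

#include <cmath>
#include <iostream>
#include <vector>

int main(){
    // Data points copied from the question: vector sizes and measured memory (KB)
    std::vector<double> x = {10, 100, 1000, 10000, 100000, 1000000, 10000000};
    std::vector<double> y = {4720, 4720, 4736, 5024, 7744, 35872, 317120};

    double n = x.size();
    double sx = 0, sy = 0, sxx = 0, syy = 0, sxy = 0;
    for (std::size_t i = 0; i < x.size(); ++i){
        sx  += x[i];        sy  += y[i];
        sxx += x[i] * x[i]; syy += y[i] * y[i];
        sxy += x[i] * y[i];
    }
    // Pearson correlation coefficient r = cov(x,y) / (sigma_x * sigma_y)
    double r = (n * sxy - sx * sy)
             / std::sqrt((n * sxx - sx * sx) * (n * syy - sy * sy));
    std::cout << "r = " << r << "\n";   // comes out essentially 1.0
    return 0;
}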
Upvotes: 8
Reputation: 171167
I can't see any exponential growth in the numbers. Size going from 10^5 to 10^6 (10x) increases memory consumption roughly 5x. Going from 10^6 to 10^7 (10x) increases memory consumption roughly 10x. So it's linear.
The first few numbers (small vector sizes) play little role here - at those sizes, total memory is dominated by the other needs of your program and its runtime, not by the vector. As soon as you get past that and the vector size starts to dominate, you get roughly linear scaling, so all is well.
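You can see this directly from the numbers in the question by looking at the ratio between consecutive measurements; a small sketch (values copied from the table above):

#include <iostream>
#include <vector>

int main(){
    // Measured memory (KB) from the question, for sizes 10^1 .. 10^7
    std::vector<double> memKB = {4720, 4720, 4736, 5024, 7744, 35872, 317120};
    // Each step multiplies the vector size by 10; once the vector dominates
    // the fixed runtime overhead, the memory ratio should approach 10x too.
    for (std::size_t i = 1; i < memKB.size(); ++i)
        std::cout << "10^" << i + 1 << " vs 10^" << i << ": "
                  << memKB[i] / memKB[i - 1] << "x\n";
    return 0;
}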
Upvotes: 4
Reputation: 179991
The capacity growth at runtime has to be exponential for push_back
to be amortized O(1). If the capacity grew by one element at a time, growing a 100,000-element vector by one element would take 100,000 element copies.
However, in this case the vector doesn't grow at all. You just initialize a vector with exponentially increasing sizes (one per run). Its memory footprint is linear in the size requested. No surprise there.
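A small sketch to illustrate both points (the exact growth factor is implementation-defined, commonly 1.5x or 2x, so the printed capacities will vary between standard libraries):

#include <iostream>
#include <vector>

int main(){
    // Growing via push_back: capacity jumps geometrically, which is what
    // keeps the amortized cost of each push_back at O(1).
    std::vector<double> grown;
    std::size_t lastCap = 0;
    for (int i = 0; i < 100000; ++i){
        grown.push_back(i);
        if (grown.capacity() != lastCap){
            lastCap = grown.capacity();
            std::cout << "size " << grown.size()
                      << " -> capacity " << lastCap << "\n";
        }
    }

    // Constructing with the size up front (as in the question): one
    // allocation, and the footprint is simply linear in the requested size.
    std::vector<double> sized(100000);
    std::cout << "pre-sized: size " << sized.size()
              << ", capacity " << sized.capacity() << "\n";
    return 0;
}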
Upvotes: 4