PEC

Reputation: 603

C++ Containers: Optimal Memory Management

I want to implement a container. The data will be stored in a dynamically allocated array. I need advice on memory reallocation.

Basically, I want a formula for how much bigger I should make the array when it is full. I think growing by a constant amount would be sub-optimal, since the larger the array gets, the longer it takes to copy.

For example, if an array can store 1000000 doubles and it becomes full, reallocating for 1000005 doubles would be stupid. Going for 1001000 would be a better idea. Conversely, if I have an array of 5 doubles and it gets full, enlarging it to 1005 units is equally stupid. Maybe enlarging it by 10% (or by 20 plus 10%, so that it behaves sensibly on small arrays too) every time would be a better idea. Any advice on this?
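To make that concrete, the growth rule I have in mind (a flat bump of 20 plus 10% of the current capacity) would look something like this; the function name is just for illustration:

    #include <cstddef>

    // Proposed growth policy: add 20 so small arrays grow sensibly,
    // plus 10% so large arrays grow roughly geometrically.
    std::size_t next_capacity(std::size_t current)
    {
        return current + 20 + current / 10;
    }

With this rule an array of 5 grows to 25, while an array of 1000000 grows to 1100020.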

Upvotes: 0

Views: 156

Answers (1)

Cort Ammon

Reputation: 10863

I would start by reusing std::vector. Don't re-implement what already works well.

If you know something about the size of your data, use the reserve() function to ensure you don't reallocate more often than you need to. Feel free to reserve 20% or 40% extra space if you don't know exactly how much data you will have.
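For example, a minimal sketch of that approach (the element counts here are just an assumed workload):

    #include <vector>

    int main()
    {
        std::vector<double> samples;

        // Expecting roughly a million doubles: reserve ~20% extra up
        // front so push_back never reallocates within that range.
        samples.reserve(1200000);

        for (int i = 0; i < 1000000; ++i)
            samples.push_back(i * 0.5);
    }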

If you don't know anything about the size of your data, std::vector is optimized for good performance without that knowledge. When you know nothing about the size, you are just as likely to have 10001 entries (where vector's aggressive growth wastefully allocates lots of extra space) as 9999 entries (where that same growth avoids the 4 or 5 wasteful copies a less aggressive algorithm would have made). std::vector implementations have been fine-tuned over many hundreds of man-hours to ensure good behavior when the user has no information on sizing.
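If you're curious what your particular implementation does, a quick diagnostic like this prints every capacity jump; the exact growth factor (commonly 1.5x or 2x) is implementation-defined:

    #include <iostream>
    #include <vector>

    int main()
    {
        std::vector<double> v;
        std::size_t last = v.capacity();

        // Log each reallocation as the vector grows one element at a time.
        for (int i = 0; i < 10000; ++i)
        {
            v.push_back(i);
            if (v.capacity() != last)
            {
                std::cout << v.size() << " elements -> capacity "
                          << v.capacity() << '\n';
                last = v.capacity();
            }
        }
    }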

The only time I'd start to deviate from that is when you get up into gigabyte-scale data sets. Then it's nice to make sure that you don't allocate something too big to fit into memory.

Upvotes: 1
