Reputation: 3460
My C++ algorithm obtains data of unknown size (it detects particles in an image one by one, and I cannot know how many particles will be detected before the algorithm finishes). So first I want to allocate, say, an array with 10000 elements, and then, during processing, allocate another 10000 elements several times if necessary.
Here is what I tried, it doesn't work:
#include <iostream>
using namespace std;
int main(){
int n = 3;
int m = 3;
float *a = new float[3];
a[0] = 0;
a[1] = 1;
a[2] = 2;
float *b = a + 2;
b = new float[3];
b[0] = 4;
b[1] = 5;
cout << a[3] << endl;
}
As a result, I got minus infinity. Of course, I could handle this with separate arrays, or I could allocate a huge amount of memory once. But I need to pass the full array of detected data to a function afterwards, so in the end I want one big array.
Still, is there a way to increase the size of a dynamically allocated array? What I want in this toy example is to increase the number of elements in array a by 3, so that it has 6 elements.
In Matlab this is absolutely possible. What about C++?
Thanks
Upvotes: 2
Views: 7828
Reputation: 50111
You should just use std::vector instead of raw arrays. It is implemented to grow efficiently. You can change its size with resize, append to it with push_back, or insert a range (among other things) with insert to grow it.
Changing the size of a manually allocated array is not possible in C++. Using std::vector over raw arrays is a good idea in general, even if the size does not change. Some arguments are the automated, leak-proof memory management, the additional exception safety, and the vector knowing its own size.
Upvotes: 8
Reputation: 7603
You should use a vector and then resize when necessary, or let it grow by itself.
When you do:
float *b = a + 2;
b = new float[3];
the second block is not guaranteed to be allocated contiguously after the first one, even though you previously set the pointer to the end of the first block (that pointer value is overwritten anyway). Therefore, accessing a[3] reads out of bounds.
Upvotes: 0
Reputation: 254751
No, you can't increase the size of an array. If you want to use an array, you'll have to allocate a new block, large enough for the whole new array, and copy the existing elements across before deleting the old array. Or you could use a more complicated data structure that doesn't store its elements contiguously.
Luckily, the standard library has containers to handle this automatically, including vector, a resizable array.
std::vector<float> a(3);
a[0] = 0;
a[1] = 1;
a[2] = 2;
// you can resize it by specifying a new size
a.resize(4);
a[3] = 3;
// or by appending new elements
a.push_back(4);
Upvotes: 6