Reputation: 2859
Hi, I have the following:
struct myStructure
{
    vector<int> myVector;
};
myStructure myArray[10000000];
As you can see, I have a very large array of vectors. The problem is that I don't have a priori knowledge of the number of elements I will need in the array, but I know that 10 million elements is the maximum I can have. I have tried two things:
a) making myArray a global array; the problem is that I have a function that accesses myArray many, many times, which results in memory leaks and the program crashing for large calculations.
b) declaring myArray dynamically from within the function that needs to access it; the memory is kept in check, but the program runs about 8 times slower.
Any ideas on how to address this issue? Thanks.
Upvotes: 2
Views: 1467
Reputation: 11
The best solution I can find is to call the function malloc, which reserves space in heap memory. In the array case you would write something like:
int* myArray = (int*) malloc( sizeof(int) * Len );
After that, don't forget to release the heap memory using free(myArray). It's a powerful tool for making arrays very large.
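A minimal sketch of the full allocate/use/free cycle (Len is a placeholder size here). Note that malloc only suits trivial types like int; the question's myStructure holds a vector whose constructor must run, so it would need new[]/delete[] (or, better, std::vector) instead:

#include <cstdlib>

int main() {
    const std::size_t Len = 10000000;               // placeholder size
    int* myArray = (int*) std::malloc(sizeof(int) * Len);
    if (myArray == NULL)                            // malloc can fail for large requests
        return 1;
    for (std::size_t i = 0; i < Len; ++i)           // use the block
        myArray[i] = 0;
    std::free(myArray);                             // release the heap memory
    return 0;
}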
Upvotes: 1
Reputation: 3733
Use a different data structure. I'd suggest trying something like one of the sparse matrix classes from Boost. They are optimised for storing numeric data in which each row or column contains a significant number of zeroes. Mind you, if the problem you're trying to solve isn't suitable for handling with a sparse data structure, it would be a good idea to set out the nature of the problem you're trying to solve, in greater detail. Take another look at https://stackoverflow.com/questions/how-to-ask even though I guess you already read that.
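As a rough illustration of the idea, here is a minimal sketch using uBLAS's compressed_matrix; the dimensions are placeholders, and whether your data maps onto a matrix at all depends on your actual problem:

#include <boost/numeric/ublas/matrix_sparse.hpp>
#include <iostream>

int main() {
    using boost::numeric::ublas::compressed_matrix;
    // A 10000000 x 100 logical matrix; only the entries that are
    // actually written consume memory.
    compressed_matrix<int> m(10000000, 100);
    m(42, 3) = 7;                 // writing an element inserts it
    std::cout << m(42, 3) << '\n';
}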
But before you do that I think you probably have another problem too:
accesses myArray many, many times, which results in memory leaks and the program crashing for large calculations
It looks to me from what you write there that your code may have some pre-existing bugs. Unless, that is, your crashes are simply caused by trying to allocate a 10000000-element array as an automatic (stack) variable.
Upvotes: 0
Reputation: 88235
accesses myArray many, many times, which results in memory leaks and the program crashing for large calculations
You should fix those bugs in any case.
the memory is kept in check, but the program runs about 8 times slower
Since you're already using dynamic allocation with an array of vectors, it's not immediately obvious why dynamically allocating one more thing would result in such a slowdown, so you should look into this as well.
Then I would go with a vector<vector<int>> that isn't global, but has the appropriate lifetime for its uses:
#include <vector>
#include <functional>
#include <algorithm>

using std::vector;

// foo stands in for whatever function fills and uses the data;
// its real signature is up to you.
void foo(vector<vector<int>>& v);

int main() {
    vector<vector<int>> v;
    for (int i = 0; i < 100; ++i) {
        // clear() empties each inner vector but keeps its capacity,
        // so later iterations reuse the memory already allocated
        std::for_each(begin(v), end(v), std::mem_fn(&vector<int>::clear));
        foo(v);
        for (int j = 0; j < 100; ++j) {
            std::for_each(begin(v), end(v), std::mem_fn(&vector<int>::clear));
            foo(v);
            for (int k = 0; k < 100; ++k) {
                std::for_each(begin(v), end(v), std::mem_fn(&vector<int>::clear));
                foo(v);
                for (int l = 0; l < 100; ++l) {
                    std::for_each(begin(v), end(v), std::mem_fn(&vector<int>::clear));
                    foo(v);
                }
            }
        }
    }
}
Upvotes: 3
Reputation: 845
Did you try turning your array of vectors into a vector of vectors? Not knowing how many of an item you will need is what vectors are for, after all.
I believe it would be
vector<vector<int>> myVecs;
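A minimal sketch of how that grows on demand (the sizes here are just placeholders):

#include <vector>
using std::vector;

int main() {
    vector<vector<int>> myVecs;   // starts empty; no need to guess 10 million up front
    myVecs.resize(1000);          // grow the outer vector only as far as needed
    myVecs[42].push_back(7);      // each inner vector grows independently
    myVecs.push_back(vector<int>{1, 2, 3});  // or append whole rows
}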
Upvotes: 0
Reputation: 98118
Declare this structure in an object whose lifetime is guaranteed to outlast everything that accesses it, and use a reference to reach it. Ideally, you would have a class in your hierarchy that owns the large array of vectors, with all the functions that deal with it as members.
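A minimal sketch of that idea (the names Calculation and run are placeholders, nothing from the question):

#include <cstddef>
#include <vector>

// The object owns the storage, so the data lives exactly as long
// as the calculation, and the member functions need no global.
class Calculation {
public:
    explicit Calculation(std::size_t maxElems) : data_(maxElems) {}

    void run() {
        data_[0].push_back(42);   // work on data_ here
    }

private:
    std::vector<std::vector<int>> data_;
};

int main() {
    Calculation calc(10000000);   // outlives everything that uses the data
    calc.run();
}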
Upvotes: 0