Reputation: 1
I have searched for the famous "std::bad_alloc" error, but I couldn't find a case like mine. I've written a program in C++ (to be precise: running under MinGW with Eclipse on Windows 7 Professional), and it runs fine for a small number of class instances, but once the number of instances exceeds 10509, an error from Windows appears and then Eclipse shows 'std::bad_alloc'. I build the instances with "new"; each one holds two pointers and three members of types "long int", "string" and "int". I cannot destroy any instance built earlier, because I need them all as nodes later, to be evaluated in another part of the program. I have added a destructor to the class definition anyway, and I'm sure it is set up properly. I also know exactly where and on which statement the program stops: right after constructing 10509 class instances, while trying to add the 10510th.
So I am wondering: is it possible to extend the reserved memory so that new class instances can be allocated?
The code is too long to be pasted here. Thank you for any help!
Upvotes: 0
Views: 1205
Reputation: 320747
Firstly, the dynamic memory pool from which new allocates typically already covers all remaining available memory in your process. There's no way to extend it further. How much dynamic memory you start with depends on how much non-dynamic data your program has; there's no way to say without knowing more about your program. By reducing the size of the non-dynamic data you might effectively "extend" what is available as dynamic memory.
Secondly, on platforms with virtual memory it is not really possible to "run out of memory" (assuming you have a healthy swap file), but you can run out of process address space. Again, if you run out of address space after allocating only 10509 objects, something else must be wrong.
Thirdly, 10509 looks like a really low number, so if you run out of memory after allocating so few objects, it probably means that the objects themselves are rather large. How large are your objects?
Fourthly, dynamic memory allocation errors might be triggered by heap corruption, meaning that an allocation can fail even if there's technically plenty of free memory still available. Again, there's no way to say without knowing more about your code.
Upvotes: 2