Reputation: 12992
I want to write code that is somewhat resistant to running out of memory. The way I'm trying to go about it is: if any memory allocation fails, the program pauses execution at that point and asks the operator to try to free some memory before the allocation is re-attempted (or, should that prove impossible, they can choose to terminate the program themselves; it's up to them).
My code for this so far seems pretty ugly, though. This is the block that precedes any std::vector operation that could potentially expand the array:
while(pointVector.size() == pointVector.capacity()){
    // will not break past this if the while statement remains true
    // ERROR.report() has the power to kill the program if it needs to
    try{
        pointVector.reserve(pointVector.capacity() * 2); // edited
    }catch(...){
        ERROR.report(Error::Severity::Memory
                     , __LINE__, __FILE__
                     , "Failed to allocate enough points"
                     , pointVector.size(), 0, 0);
    }
}
pointVector.push_back(point);
The ERROR object has all of its resources specially pre-allocated, so it can ask the operator without causing any new problems (in theory). My question is: is there a better form this can take? Does C++ have 're-try' logic for this kind of situation? Or is this pretty much how it should go?
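(To show what I mean by pre-allocating, here is a rough sketch of the idea behind the reporter; the names are just placeholders, not my real code:)

    #include <cstdio>

    // Rough sketch only (placeholder names): the reporter owns a fixed buffer,
    // so reporting an out-of-memory condition needs no further allocation.
    class PreallocatedReporter {
    public:
        void report(const char* text, long line, const char* file)
        {
            std::snprintf(buffer_, sizeof(buffer_), "%s:%ld: %s", file, line, text);
            std::fprintf(stderr, "%s\n", buffer_);
        }
    private:
        char buffer_[1024];   // part of the object itself, never grows
    };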
Upvotes: 3
Views: 181
Reputation: 1211
The function you are looking for is std::set_new_handler(new_handler new_p)
It allows you to specify a function that is called when an allocation is about to fail because your program has run out of memory. That function then has the opportunity to free some memory so the allocation can succeed. If the handler is able to free some memory, it simply returns and operator new retries the allocation; otherwise it has to throw a std::bad_alloc exception or terminate the program. (The documentation for set_new_handler has more specific information.)
One nice feature of using set_new_handler is that you then don't have to wrap every call to new in a try/catch block to make sure you haven't run out of memory.
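A minimal sketch of how this could look (the emergency reserve and handler names are just illustrative, not part of any standard API):

    #include <iostream>
    #include <new>
    #include <vector>

    // Illustrative: hold an emergency reserve that the handler can release.
    static std::vector<char>* emergencyReserve = new std::vector<char>(16 * 1024 * 1024);

    void lowMemoryHandler()
    {
        if (emergencyReserve) {
            delete emergencyReserve;          // free the reserve and return:
            emergencyReserve = nullptr;       // operator new will retry the allocation
            std::cerr << "Low on memory: released emergency reserve\n";
            return;
        }
        throw std::bad_alloc();               // nothing left to free
    }

    int main()
    {
        std::set_new_handler(lowMemoryHandler);
        std::vector<int> pointVector;
        pointVector.push_back(42);            // no per-call try/catch needed
    }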
Upvotes: 1
Reputation: 36082
I think your best approach would instead be to allocate memory from a memory pool, so that you have full control of the memory and can do some cleaning up, such as defragmentation, if you run out. Doing a recovery the way you propose would IMHO just delay the inevitable.
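For illustration only (not a production allocator), a pool that grabs all of its memory once at startup could look roughly like this:

    #include <cstddef>
    #include <new>
    #include <vector>

    // Rough sketch: a bump allocator over a buffer reserved once at startup,
    // so later allocations never go to the OS and can be reset in one step.
    class FixedPool {
    public:
        explicit FixedPool(std::size_t bytes) : buffer_(bytes), offset_(0) {}

        void* allocate(std::size_t bytes)
        {
            if (offset_ + bytes > buffer_.size())
                throw std::bad_alloc();       // pool exhausted: caller decides what to do
            void* p = buffer_.data() + offset_;
            offset_ += bytes;
            return p;
        }

        void reset() { offset_ = 0; }         // release everything in one go

    private:
        std::vector<char> buffer_;
        std::size_t offset_;
    };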
Upvotes: 0
Reputation: 96233
Unless you have truly exceptional requirements that you aren't discussing in your question, it's not unreasonable for an application to simply exit if its memory needs are not met: let the user free up memory and run your program again. This avoids writing extra code that won't be needed in 99.9% of cases, and, in this case, extra code that can drastically affect your performance.
For example, push_back normally runs in amortized constant time, but your "grow + push_back" combination would actually run in linear time due to linearly increasing the size of the container. This huge performance decrease would affect all your users, while providing a benefit for only the tiny fraction who experience running out of memory.
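In other words, something closer to this sketch, with a single handler at the top instead of a loop around every insertion:

    #include <iostream>
    #include <new>
    #include <vector>

    int main()
    {
        std::vector<int> pointVector;
        try {
            // Plain push_back: the vector grows geometrically on its own,
            // so insertion stays amortized constant time.
            for (int i = 0; i < 1000000; ++i)
                pointVector.push_back(i);
        } catch (const std::bad_alloc&) {
            // One handler at the top level instead of wrapping every insert.
            std::cerr << "Out of memory; free some memory and run the program again.\n";
            return 1;
        }
        return 0;
    }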
Upvotes: 2
Reputation: 129314
According to the principle of "you only pay for what you use", C++ does not "repeat" or "try again" by itself.
Whether it is actually worth trying again in such a case is of course another matter... Unless you have a system where you are running something system-critical, and low-memory operation is a big part of the situations you are required to handle, I'd say it's probably WORSE to try again than to bail out.
You may also want to check that the error actually is std::bad_alloc [or whatever the "you can't allocate" exception is called], as retrying on some other error is probably pretty pointless.
You should definitely set a limit on the number of times you repeat the loop (just in case there is a bug in ERROR.report()).
[Growing by 32 at a time is fine as long as your list of items is fairly small, but I have seen code almost identical to this cause problems: by growing "only" 32 at a time to a total size of 32 MB, it caused the data to be copied a gazillion times, making the application appear to have hung, which resulted in a bug report from a user. Changing it to geometric growth (doubling each time) fixed that particular bug.]
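Putting those points together, a sketch of what a bounded retry with geometric growth might look like (askOperatorToFreeMemory() is just a stand-in for whatever ERROR.report() ends up doing):

    #include <new>
    #include <vector>

    // Stand-in for the operator prompt; real code would ask the user to free memory.
    bool askOperatorToFreeMemory() { return false; }

    template <typename T>
    void growGeometrically(std::vector<T>& v)
    {
        const int maxRetries = 3;                               // hard cap on retries
        for (int attempt = 0; attempt < maxRetries; ++attempt) {
            try {
                v.reserve(v.empty() ? 32 : v.capacity() * 2);   // geometric growth
                return;
            } catch (const std::bad_alloc&) {                   // only retry on allocation failure
                if (!askOperatorToFreeMemory())
                    throw;                                      // operator gave up: propagate
            }
            // any other exception propagates immediately; retrying would be pointless
        }
        throw std::bad_alloc();                                 // retry budget exhausted
    }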
Upvotes: 2
Reputation: 5823
Generally, no, C++ does not have a built-in way around this. Running out of memory is not something a programming language can usually accommodate, so it will fail, assuming that continued functionality is not possible without the desired memory. Since there is no intrinsic 're-try' logic, you're left with something like what you have. Sorry, there isn't a particularly clean method here.
Upvotes: 3