user3979554

Reputation: 15

dynamic memory allocation using new with binary search in C++

I am trying to find the maximum amount of memory that can be allocated using new[]. I have used binary search over the allocation size so that finding the largest allocation that succeeds is a bit faster:

bool allocated = false;
int* ptr = nullptr;
int mid = 0;
int low = 0, high = std::numeric_limits<int>::max();
while (true)
{
    try
    {
        mid = (low + high) / 2;
        ptr = new int[mid];
        delete[] ptr;
        allocated = true;
    }
    catch (const std::bad_alloc& e)
    { /* ... */ }
    if (allocated)
    {
        low = mid;
    }
    else
    {
        high = low;
        cout << "maximum memory allocated at: " << ptr << endl;
    }
}

I have modified my code and am using new logic to solve this. My problem right now is that it goes into a never-ending loop. Is there a better way to do this?

Upvotes: 0

Views: 416

Answers (3)

n. m. could be an AI

Reputation: 119847

This code is useless for a couple of reasons.

  1. Depending on your OS, the memory may or may not be allocated until it is actually accessed. That is, new happily returns a new memory address, but it doesn't make the memory available just yet. It is actually allocated later when and if a corresponding address is accessed. Google up "lazy allocation". If the out-of-memory condition is detected at use time rather than at allocation time, allocation itself may never throw an exception.
  2. If you have a machine with more than 2 gigabytes available, and your int is 32 bits, alloc will eventually overflow and become negative before the memory is exhausted. Then you may get a bad_alloc. Use size_t for all things that are sizes. A sketch illustrating both points follows this list.
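
Below is a minimal sketch of what these two points imply in practice (my own illustration, not code from the question or this answer; the helper name is made up). The element count is a std::size_t, and the block is written to after new[] so that lazily allocated pages are actually committed. On systems that overcommit memory, the writes may get the process killed by the OS instead of making new throw, which is exactly the use-time detection described in point 1.

#include <cstddef>
#include <new>

// Hypothetical helper: returns true if `count` ints can be allocated AND used.
bool try_allocate_and_touch(std::size_t count)
{
    try
    {
        int* p = new int[count];
        // Write to one int per 4 KiB page so the OS must really back the memory.
        const std::size_t ints_per_page = 4096 / sizeof(int);
        for (std::size_t i = 0; i < count; i += ints_per_page)
            p[i] = 1;
        delete[] p;
        return true;
    }
    catch (const std::bad_alloc&)
    {
        return false;
    }
}

A bisection loop can then call this helper instead of relying on new alone to report failure.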

Upvotes: 1

Bathsheba

Reputation: 234645

This is a particularly bad test.

For the first part you have undefined behaviour. That's because you should only ever delete[] the pointer returned to you by new[]. You need to delete[] pvalue, not value.

The second thing is that your approach will be fragmenting your memory as you're continuously allocating and deallocating contiguous memory blocks. I imagine that your program will understate the maximum block size due to this fragmentation effect. One solution would be to launch each trial as a separate instance of your program from the command line, passing the allocation block size as a parameter. Use a divide and conquer bisection approach to attain the maximum size (with some reliability) in log(n) trials; a sketch of such a probe program follows.
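
A minimal sketch of that probe program (my own illustration; the answer does not provide code, and the file name and units are assumptions). It takes a byte count on the command line, tries to allocate and touch that much memory, and reports the outcome through its exit status, so a driver can bisect on the size by launching it once per trial:

// probe.cpp -- hypothetical probe, run once per trial
#include <cstddef>
#include <cstdlib>
#include <new>

int main(int argc, char** argv)
{
    if (argc != 2)
        return 2;                         // usage error

    const std::size_t bytes = std::strtoull(argv[1], nullptr, 10);
    try
    {
        char* block = new char[bytes];
        for (std::size_t i = 0; i < bytes; i += 4096)
            block[i] = 1;                 // touch each page
        delete[] block;
        return 0;                         // allocation and use succeeded
    }
    catch (const std::bad_alloc&)
    {
        return 1;                         // allocation failed
    }
}

Because every trial runs in a fresh process, fragmentation left over from earlier trials cannot distort later ones; the driver simply narrows its [low, high] interval depending on whether the exit status was 0.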

Upvotes: 0

Rocky Pulley

Reputation: 23301

Assuming you are doing ++alloc and not ++allocation, it shouldn't matter what address it uses. If you want it to use a different address every time, then don't delete the pointer.

Upvotes: 0
