I wrote test code in C++:
#include <cstddef>
#include <iostream>
#include <new>

void handler()
{
    std::cout << "allocation failed" << std::endl;
    std::set_new_handler(nullptr);
}

int main()
{
    std::size_t allocations_count = 0u;
    std::set_new_handler(handler);
    try {
        while (true) {
            new char[1024u * 1024u * 1024u];  // 1 GiB, never freed
            ++allocations_count;
        }
    } catch (const std::bad_alloc& e) {
        std::cout << e.what() << '\n';
    }
    std::cout << "allocated " << allocations_count << " GB" << std::endl;
}
On my machine with 8 GB of RAM, I got this output:
BigAllocations(5125,0x1050e7d40) malloc: can't allocate region
:*** mach_vm_map(size=1073741824, flags: 100) failed (error code=3)
BigAllocations(5125,0x1050e7d40) malloc: *** set a breakpoint in malloc_error_break to debug
allocation failed
BigAllocations(5125,0x1050e7d40) malloc: can't allocate region
:*** mach_vm_map(size=1073741824, flags: 100) failed (error code=3)
BigAllocations(5125,0x1050e7d40) malloc: *** set a breakpoint in malloc_error_break to debug
std::bad_alloc
allocated 130676 GB
I allocated 130,676 GB, which seems far too much: my storage is only 500 GB, and I don't see how virtual memory could serve that amount. Can anyone explain why so many allocations succeed?
MacBook Pro (13-inch, M1, 2020), 8 GB RAM, 500 GB storage
Upvotes: 1
Views: 828
A 64-bit address space spans 2^64 bytes — roughly 17 billion GB, several orders of magnitude more than the 130,676 GB you allocated. (In practice the OS exposes only part of that range to a process, which is why your loop did eventually fail, but the usable range is still vastly larger than physical RAM.)
So virtual memory can easily fit the amount you allocated, even with all the special address ranges the OS reserves for itself. As long as you never write to that memory, it reads as zeroes and does not need to be stored anywhere (note: the C++ new expression does not guarantee the memory contains zeroes, but it certainly allows it).
Initialize the memory you allocate to something non-zero and see what happens. On Linux this would trigger the OOM (out-of-memory) killer, which starts terminating processes once the OS runs out of physical memory and disk swap to store the data. macOS presumably does something similar.
Upvotes: 8