Reputation: 1673
I need to allocate a fairly large chunk (or chunks) of memory - several gigabytes. But if I try to allocate a float array of more than 532000000 elements (~2 GB), I get a runtime error:
terminate called after throwing an instance of 'std::bad_alloc' what(): std::bad_alloc
This is ok:
float* d = new float[532000000];
But this is bad (bad_alloc exception):
float* d = new float[533000000];
Then I tried to allocate another array in addition to the first. It turned out that the maximum size of the second float array is 196000000 elements (~748 MB).
This is ok:
float* d = new float[532000000];
float* e = new float[196000000];
This is bad:
float* d = new float[532000000];
float* e = new float[197000000];
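Wrapping the allocation in a try/catch (a minimal repro using the failing size from above) shows the same std::bad_alloc instead of terminating:

#include <iostream>
#include <new>

int main() {
    try {
        // the size that fails on my machine
        float* d = new float[533000000];
        delete[] d;
    } catch (const std::bad_alloc& e) {
        std::cerr << "allocation failed: " << e.what() << '\n';
    }
    return 0;
}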
I would like to know what the limits on memory allocation in an application are and how to avoid them. How can I use virtual memory?
My system: 32-bit Ubuntu 12.10, compiler: gcc 4.7, RAM: 8 GB (~6.5 GB free)
Upvotes: 2
Views: 3933
Reputation: 126957
You hit the limit of the virtual address space: even if you have enough physical RAM (which the OS can probably access via PAE, using 36-bit physical addresses), on a 32-bit system each process still has a 32-bit virtual address space, which means that a process can't map more than 4 GB of memory at a time.
Since the upper half of the virtual address space (or just the upper 1 GB, depending on kernel settings) is usually reserved for the kernel, the allocation limit is typically around 2 GB, and virtual address space fragmentation can lower this number further.
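A quick way to see this limit in practice is to probe for the largest single allocation that succeeds, e.g. with the nothrow form of new (the starting size and step below are arbitrary):

#include <iostream>
#include <new>

int main() {
    std::size_t elems = 1000000000;            // start near 4 GB worth of floats (too much on 32-bit)
    while (elems > 0) {
        // nothrow new returns nullptr on failure instead of throwing std::bad_alloc
        float* p = new (std::nothrow) float[elems];
        if (p) {
            std::cout << "Largest successful allocation: "
                      << elems * sizeof(float) / (1024.0 * 1024.0) << " MB\n";
            delete[] p;
            break;
        }
        elems -= 10000000;                     // shrink by 10 million floats and retry
    }
    return 0;
}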
There are various workarounds (for example, on Windows you can use memory-mapped files larger than 4 GB, mapping only a portion of them at a time; on Linux you can probably do the same with mmap), but currently the simplest solution is just to move to a 64-bit OS and recompile the application for 64 bit.
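On Linux, the "map one window at a time" idea could look roughly like this sketch (the file name and window size are made up; on 32-bit you would build with -D_FILE_OFFSET_BITS=64 so off_t is 64-bit):

#include <cstdio>
#include <fcntl.h>
#include <sys/mman.h>
#include <sys/stat.h>
#include <unistd.h>

int main() {
    const char* path = "huge_data.bin";        // hypothetical multi-GB file of floats
    int fd = open(path, O_RDONLY);
    if (fd < 0) { perror("open"); return 1; }

    struct stat st;
    if (fstat(fd, &st) != 0) { perror("fstat"); return 1; }

    const off_t window = 256 * 1024 * 1024;    // map 256 MB at a time (page-aligned offset)
    for (off_t offset = 0; offset < st.st_size; offset += window) {
        off_t len = (st.st_size - offset < window) ? (st.st_size - offset) : window;
        void* p = mmap(nullptr, len, PROT_READ, MAP_PRIVATE, fd, offset);
        if (p == MAP_FAILED) { perror("mmap"); return 1; }

        const float* data = static_cast<const float*>(p);
        (void)data;                            // ... process the floats in this window ...

        munmap(p, len);                        // release the window before mapping the next one
    }
    close(fd);
    return 0;
}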
Upvotes: 5