Reputation: 379
I'm allocating memory for three very large arrays (N = 990000001). I know arrays this large have to go on the heap, but even when I allocate them there, the program keeps crashing. Am I allocating the memory incorrectly, or does my computer simply not have enough (I should have plenty)? The allocation below works perfectly fine when N is small. Any help is appreciated.
int main()
{
    // N, X0, dx, U0, alpha, and deriv() are defined elsewhere (omitted here)
    double *Ue = new double[N];
    double *U  = new double[N];
    double *X  = new double[N];

    for (int i = 0; i < N; i++)
    {
        X[i]  = X0 + dx*i;
        Ue[i] = U0/pow((X0*X[i]), alpha);
    }

    // Declare variables
    double K1; double K2; double K3; double K4;

    // Set initial condition
    U[0] = U0;
    for (int i = 0; i < N-1; i++)
    {
        K1 = deriv(U[i], X[i]);
        K2 = deriv(U[i] + 0.5*dx*K1, X[i] + 0.5*dx);
        K3 = deriv(U[i] + 0.5*dx*K2, X[i] + 0.5*dx);
        K4 = deriv(U[i] + dx*K3, X[i+1]);
        U[i+1] = U[i] + dx/6*(K1 + 2*K2 + 2*K3 + K4);
    }

    delete[] Ue;
    delete[] U;
    delete[] X;
    return 0;
}
Upvotes: 1
Views: 2692
Reputation: 1854
If you are on Windows, you may find this information useful: Memory Limits for Applications on Windows -
Note that the limit on static and stack data is the same in both 32-bit and 64-bit variants. This is due to the format of the Windows Portable Executable (PE) file type, which is used to describe EXEs and DLLs as laid out by the linker. It has 32-bit fields for image section offsets and lengths and was not extended for 64-bit variants of Windows. As on 32-bit Windows, static data and stack share the same first 2GB of address space.
Then, the only real improvement in the 64-bit limits is for dynamic data -
Dynamic data - this is memory that is allocated during program execution. In C or C++ this is usually done with malloc or new.
64-bit:
Static data - 2 GB
Dynamic data - 8 TB
Stack data - 1 GB (the stack size is set by the linker; the default is 1 MB. This can be increased using the Linker property System > Stack Reserve Size)
Allocation of a single array "should be able to allocate as large as the OS is willing to handle" (i.e. it is limited by RAM and fragmentation).
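To connect those categories to code, here is a minimal sketch (the sizes are purely illustrative, not taken from the question) of how the same kind of buffer falls under a different limit depending on where it lives:
#include <cstddef>

double static_buf[100 * 1000 * 1000];        // static data: counts against the 2 GB image limit, even on 64-bit

int main()
{
    // double stack_buf[10 * 1000 * 1000];   // stack data: ~80 MB would blow the default 1 MB stack
    const std::size_t n = 100 * 1000 * 1000;
    double *heap_buf = new double[n];        // dynamic data: limited by the 8 TB address space and available RAM/swap
    static_buf[0] = 1.0;
    heap_buf[0]   = 0.0;
    delete[] heap_buf;
    return 0;
}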
Upvotes: 0
Reputation: 213837
Your program allocates and uses about 24 GB of memory.
If you run the program as a 32-bit process, this will throw std::bad_alloc and your program will exit gracefully. (Theoretically there could be an overflow bug in your toolchain, but I think this is unlikely.)
If you run the program as a 64-bit process and you don't have 24 GB of combined RAM + swap, you might get snagged by the OOM killer and your program will exit ungracefully. If you do have 24 GB in combined RAM + swap, you might instead churn through it at the speed of your disk. (If you actually had 24 GB of RAM, it probably wouldn't crash, so we can rule that out.) If overcommit is disabled, you will get std::bad_alloc
instead of the OOM killer. (This paragraph is somewhat Linux-specific, though other kernels behave similarly.)
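For reference, here is where the ~24 GB figure comes from and what a failed new looks like (a minimal sketch; N is taken from your code, everything else is just for illustration):
#include <iostream>
#include <new>        // std::bad_alloc

int main()
{
    const long long N = 990000001LL;

    // 3 arrays * N doubles * 8 bytes each ~= 23.8 GB
    std::cout << 3.0 * N * sizeof(double) / 1.0e9 << " GB\n";

    try
    {
        // With Linux overcommit enabled, this may "succeed" and the process
        // only dies later (OOM killer) once the pages are actually touched.
        double *U = new double[N];
        delete[] U;
    }
    catch (const std::bad_alloc &e)
    {
        std::cerr << "allocation failed: " << e.what() << '\n';
    }
    return 0;
}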
Solution: Use less memory or buy more RAM.
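One way to use less memory with this particular loop is to compute X[i] (and, if needed, Ue[i]) on the fly instead of storing them, so only U is allocated (~8 GB instead of ~24 GB). A sketch, with placeholder values for X0, dx, U0, alpha and a stand-in deriv, since those aren't shown in the question:
#include <cmath>

// Placeholder constants and derivative -- substitute your real definitions.
const long long N = 990000001LL;
const double X0 = 1.0, dx = 1.0e-9, U0 = 1.0, alpha = 1.0;
double deriv(double u, double x) { return -alpha * u / x; }   // stand-in for the question's deriv()

int main()
{
    double *U = new double[N];              // only the solution array is stored (~8 GB)
    U[0] = U0;
    for (long long i = 0; i < N - 1; i++)
    {
        double x  = X0 + dx*i;              // X[i] recomputed on demand instead of stored
        double K1 = deriv(U[i], x);
        double K2 = deriv(U[i] + 0.5*dx*K1, x + 0.5*dx);
        double K3 = deriv(U[i] + 0.5*dx*K2, x + 0.5*dx);
        double K4 = deriv(U[i] + dx*K3, x + dx);
        U[i+1] = U[i] + dx/6*(K1 + 2*K2 + 2*K3 + K4);
    }
    // Ue[i] = U0/pow(X0*x, alpha) can likewise be evaluated wherever it is actually used.
    delete[] U;
    return 0;
}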
Upvotes: 3