Vincent

Reputation: 1371

C++ Segmentation fault std::array

I have approx. 2 GB of free DRAM on my computer. Compiling with either a std::array or a plain dynamically allocated array:

#include <iostream>
#include <array>

int main(int argc, char *argv[]){

    // int* a = new int[500000000]; // ~2 GB on the heap (commented out)
    std::array<int, 2000000> a;     // ~8 MB on the stack

}

with:

$ g++ -std=c++11 main.cpp -o main
$ ./main

works for both arrays. Changing the size of the std::array to:

// ceteris paribus 
std::array<int, 2095300> a; 

leads to:

$ ./main
Segmentation fault (core dumped) 

Honestly, I am not sure whether this issue has already been addressed somewhere.

From my understanding, the std::array is created on the stack and the int* array on the heap. My guess was that my stack is simply no larger than ~8 MB, which compared to the ~2 GB heap sounded disproportionate. Thus I also tried out:

int a[2096000]; // plain C-style array, also on the stack

which also causes a segmentation fault. So my question is: what causes the segmentation fault?

Thank you in advance.

Upvotes: 3

Views: 2921

Answers (1)

bames53

Reputation: 88155

You're putting a large array on the stack, causing the stack to overflow: 2,095,300 ints × 4 bytes is about 8.4 MB, which together with everything else already on the stack exceeds the typical 8 MiB default limit.

You can set how large the stack is (see: Change stack size for a C++ application in Linux during compilation with GNU compiler), but a better option is probably to use the heap.
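For illustration, here is a minimal sketch of the heap-based alternative, using std::vector with the element count from the question; the vector owns its storage and frees it automatically:

#include <vector>

int main(){
    // The vector object itself is tiny and lives on the stack;
    // its 2,095,300 ints (~8.4 MB) are allocated on the heap,
    // so the default stack limit never comes into play.
    std::vector<int> a(2095300);
}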

It just sounded a bit disproportionate that the stack is so much smaller than the heap.

The stack is memory that's actually allocated, which means you don't want it to be larger than you really need, because if memory is used for the stack then it won't be available for other uses. The heap, on the other hand, doesn't take up memory unless it's actually requested, so allowing the heap to potentially take up a large proportion of the address space is fine.
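As a rough illustration of that point (assuming a 64-bit Linux system with its default overcommit behavior; other systems may refuse such a request up front):

#include <iostream>
#include <new>

int main(){
    // Request ~4 GB of heap address space. Under lazy commitment
    // this usually succeeds even with far less physical RAM free,
    // because pages are only backed when first written to.
    int* p = new (std::nothrow) int[1000000000];
    std::cout << (p ? "allocation succeeded" : "allocation failed") << '\n';
    delete[] p;   // deleting a null pointer is a safe no-op
}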

The stack also generally doesn't need to be very large because the maximum depth of function calls usually isn't that high. A few megabytes is almost always more than enough.
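If you want to check what limit your own system imposes, here is a small sketch using the POSIX getrlimit call (shells expose the same value via ulimit -s; this assumes Linux or another POSIX system):

#include <sys/resource.h>
#include <iostream>

int main(){
    // Query the soft limit on this process's stack size.
    rlimit rl;
    if (getrlimit(RLIMIT_STACK, &rl) == 0)
        std::cout << "stack soft limit: " << rl.rlim_cur
                  << " bytes\n";   // typically 8388608 (8 MiB) on Linux
}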

Upvotes: 5
