Reputation: 655
I'm running a program that starts like the code below. In the past, compiling it with f90 -o fakefile fakefile.f
and then running ./fakefile
worked, but now it immediately fails with a segmentation fault (core dumped)
error. When I instead compile with gfortran fakefile.f
and run ./a.out
the code runs fine. I cannot figure out what the discrepancy is between the two compilation methods.
      program fakefile
      implicit real*8(a-h,o-z)
      parameter(im9=4320,jm9=2160)
      parameter(im1=360,jm1=180)
      parameter(im25=1440,jm25=720)
      parameter(nlt=2)
      real*4 rrs(im9,jm9,nlt)
      real*8 rrsa25(im25,jm25,nlt)
      real*8 area25(im25,jm25,nlt)
      real*8 rrsa1(im1,jm1,nlt)
      real*8 area1(im1,jm1,nlt)
      rrsa1 = 0.0
      area1 = 0.0
      rrsa25 = 0.0
      area25 = 0.0
      rrs = 0.0
      print *, rrs
      end
Upvotes: 1
Views: 428
Reputation: 6915
Your segmentation fault is due to overflowing the stack with your large local arrays: together they total roughly 110 MB, far beyond the common 8 MB default stack limit. This is a common issue with the Absoft and Intel Fortran compilers, which place such arrays on the stack (gfortran evidently does not here, which is why ./a.out runs). For your compiler (Absoft), use the -s
flag to tell the compiler to use static storage for the arrays instead of the stack. The alternative is to raise the stack size limit in your shell (which may be restricted by the administrator).
See the Absoft FAQ: When I declare large arrays (>8 MB of variables), I get a segmentation violation from Linux.
A. Use the "-s" compiler option (static storage) to move the data from the stack to the heap or use the ulimit command (ulimit is a bash command - the csh equivalent to 'ulimit -s' is 'limit stack') to raise the stack size limit
# ulimit -s 8192
# ulimit -s 32768
Once raised, the limit applies to the current process and any children of that process.
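A typical bash session looks like this (the 131072 value is just an example large enough for the ~110 MB of arrays; ulimit -s takes kilobytes):

```shell
# Values are in KB. Query the current soft limit, then raise it.
ulimit -s          # often prints 8192 (8 MB) on Linux
ulimit -s 131072   # raise to 128 MB for this shell and its children
ulimit -s          # confirm the new limit
```

Any ./fakefile launched from this shell then inherits the raised limit. Note that an unprivileged user can only raise the soft limit up to the hard limit shown by ulimit -Hs.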
Upvotes: 1