I am pleased to join this very active Intel Fortran forum; this is my first post.
The problem is related to deallocating allocated arrays. The same code seemingly works fine on Mac OS X, but fails to execute successfully on Linux. I enclose the code that is giving the problem; it basically interfaces dstegr, a LAPACK routine.
I debugged the code as follows:
1) First I set ulimit -s unlimited
2) I compile the code as
ifort lapack_dstegr.f90 test_dstegr.f90 -llapack -lblas -g -traceback -warn all -check all
3) Running a plain ./a.out, it hangs
4) So I checked with gdb:
[Thread debugging using libthread_db enabled]
Using host libthread_db library "/lib/x86_64-linux-gnu/libthread_db.so.1".
Linear dimension of the matrix 100
Program has been run successfully
Number of eigenvalues: 100
6.473654E-01 3.540588E+00 8.517578E+00 1.550983E+01
Program received signal SIGSEGV, Segmentation fault.
0x00002aaaabe94c6c in __GI___libc_free (mem=0x6c6ca0) at malloc.c:2945
2945 malloc.c: No such file or directory.
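For context on why I find this confusing: as I understand it, a SIGSEGV inside __libc_free during a deallocate usually means the heap was corrupted by an earlier out-of-bounds write, not that the deallocate itself is wrong. A memory checker should report the first bad write directly. A sketch of what I could try next (assuming valgrind is installed; the flags shown are the ones from my build above):

```shell
# Rebuild with debug info, then run under valgrind; it reports the
# line of the first invalid write rather than the later crash in free.
ifort lapack_dstegr.f90 test_dstegr.f90 -llapack -lblas -g -traceback
valgrind ./a.out
```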
I also checked with gfortran, which suggests there is some problem in the deallocation of the allocated arrays "w" and "subdiag" in test_dstegr.f90 at lines 92 and 96, respectively. However, I don't see any problem at these places: I have verified that "w" and "subdiag" are allocated before the deallocations occur. So this problem seems ambiguous to me. I would appreciate it very much if someone could shed light on this issue.
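One detail of the dstegr interface that I am double-checking on my side, in case it helps others spot the issue: per the LAPACK documentation, the E argument (the subdiagonal) must be dimensioned (N), not (N-1), because E(N) is used as workspace and E is overwritten on exit. An (N-1)-sized array passed there would be written one element past its end, which corrupts the heap and makes the crash surface later, inside deallocate/free. A minimal sketch of how I believe the call should be set up (the program and variable names here are hypothetical, not from my actual test_dstegr.f90; the workspace-query convention lwork = liwork = -1 is from the LAPACK docs):

```fortran
program dstegr_sketch
  implicit none
  integer, parameter :: n = 100
  double precision, allocatable :: diag(:), subdiag(:), w(:), z(:,:), work(:)
  integer, allocatable :: isuppz(:), iwork(:)
  integer :: m, info, lwork, liwork

  allocate(diag(n))
  ! dstegr requires E dimensioned (n), not (n-1):
  ! E(n) is used as workspace and E is overwritten on exit.
  allocate(subdiag(n))
  allocate(w(n), z(n,n), isuppz(2*n))

  diag = 2.0d0
  subdiag(1:n-1) = -1.0d0        ! simple test matrix (1-D Laplacian)

  ! Workspace query: lwork = liwork = -1 returns the optimal sizes
  ! in work(1) and iwork(1) without computing anything.
  allocate(work(1), iwork(1))
  call dstegr('V', 'A', n, diag, subdiag, 0d0, 0d0, 0, 0, 0d0, &
              m, w, z, n, isuppz, work, -1, iwork, -1, info)
  lwork  = int(work(1))
  liwork = iwork(1)
  deallocate(work, iwork)
  allocate(work(lwork), iwork(liwork))

  ! Actual computation: all eigenvalues and eigenvectors.
  call dstegr('V', 'A', n, diag, subdiag, 0d0, 0d0, 0, 0, 0d0, &
              m, w, z, n, isuppz, work, lwork, iwork, liwork, info)
  print *, 'info =', info, '  eigenvalues found:', m

  deallocate(diag, subdiag, w, z, isuppz, work, iwork)
end program dstegr_sketch
```

If my actual code allocates subdiag(n-1), that alone would explain the "ambiguous" crash in the deallocations, since the corruption happens in the dstegr call but is only detected by free.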