arpack: memory issue?

Greetings everyone. I'm rather new to Fortran, so please bear with me.

I am trying to use ARPACK for an eigenvalue problem. I am wondering if there are certain compiler flags I need to use, or if I am simply at a memory limit. Here is the background:

I can compile the code just fine, and I can run the test case that interests me (dssimp.f, for real symmetric matrices in double precision, for those interested). There is one parameter, maxn, that controls the maximum allowable array size. If I set maxn = 8E6, the program runs fine, but if I set maxn = 9E6, I get a segmentation fault. My guess is that it is running out of memory, but I don't know for sure.
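For context, the declarations that maxn feeds into look roughly like this (I'm paraphrasing the stock dssimp.f from memory, so the exact names and default values may be slightly off):

      integer          maxn, maxnev, maxncv, ldv
      parameter       (maxn=256, maxnev=10, maxncv=25, ldv=maxn)
      Double precision
     &                 v(ldv,maxncv), workl(maxncv*(maxncv+8)),
     &                 workd(3*maxn), d(maxncv,2), resid(maxn),
     &                 ax(maxn)

So everything sized by maxn is a fixed-size array in static storage, which I assume is why the bss segment grows with maxn in the size output below.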

As far as limits are concerned:

$ ulimit -a
core file size (blocks, -c) 0
data seg size (kbytes, -d) unlimited
file size (blocks, -f) unlimited
max locked memory (kbytes, -l) unlimited
max memory size (kbytes, -m) unlimited
open files (-n) 1024
pipe size (512 bytes, -p) 8
stack size (kbytes, -s) unlimited
cpu time (seconds, -t) unlimited
max user processes (-u) unlimited
virtual memory (kbytes, -v) unlimited

If I look at the executable for a case where it works, I get:
$ size dssimp
   text    data        bss        dec      hex filename
 501974   14600 1920023996 1920540570 72791f9a dssimp

For the case when it fails, I get:
$ size dssimp
   text    data        bss        dec      hex filename
 501974   14600 2088022844 2088539418 7c7c951a dssimp

I'm using a 2 GHz Pentium 4 with 1 GB of RAM. I looked at options like -mcmodel=large, but I gather that only applies to 64-bit targets, right?

So is there some way out? Is there something I can do to use a larger array, such as some sort of dynamic allocation (I've sketched what I mean below), or making use of swap space?

Or am I forever confined to arrays smaller than 9E6 with ARPACK?
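In case it helps, here is roughly what I mean by dynamic allocation: just a sketch of sizing the work arrays at run time with ALLOCATE instead of PARAMETER (the names mirror dssimp.f, the values are made up, and I have not tried wiring this into the full driver):

      program alloctest
c     Sketch only: allocate dssimp-style work arrays at run time
c     instead of declaring them with fixed PARAMETER sizes.
      integer maxn, maxncv, ldv, ierr
      double precision, allocatable :: v(:,:), workd(:),
     &                                 resid(:), ax(:)
c
      maxn   = 9000000
      maxncv = 25
      ldv    = maxn
      allocate (v(ldv,maxncv), workd(3*maxn), resid(maxn),
     &          ax(maxn), stat=ierr)
      if (ierr .ne. 0) then
         print *, 'allocation failed, stat = ', ierr
         stop
      end if
      print *, 'work arrays allocated'
      deallocate (v, workd, resid, ax)
      end

Would something along those lines even help, or would I just hit the same wall?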

Many many thanks!
C
