I recently encountered a problem that I have not seen before and am not sure I understand. I have been working on some code for around 10 years now, all of which is compiled into a single executable. The program is designed to carry out repeated numerical optimisations a very large number of times (running into the billions), and it stores results in a series of allocatable arrays (to avoid stack overflow issues). I also use a large number of allocatable variables as globals, which are referenced throughout the code. I recently found that adding a single global allocatable array to my program produces unreliable results. Specifically, if I run the program without the new global allocatable variable X, I get some answer Y1 for a particular optimisation. If I then add the allocatable variable X, and allocate it but do not use it, I still get the same result Y1 when the revised code is run through VS. But if I run the revised code outside of the debugger, I get a different result Y2 .ne. Y1 - using the same compiled version of the code (i.e., not recompiling, just double-clicking the executable rather than launching it through the debugger).
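To make the structure concrete, here is a minimal sketch of the kind of change I am describing (the names and sizes here are placeholders, not my actual code): a module of global allocatables that the whole program uses, to which the new array X is added and allocated but never referenced.

    ! Hypothetical minimal sketch - names and sizes are illustrative only.
    module globals
        implicit none
        real(8), allocatable :: results(:)   ! existing global result storage
        real(8), allocatable :: x(:)         ! the newly added array, never used
    end module globals

    program optimiser
        use globals
        implicit none
        allocate(results(1000000))
        allocate(x(1000))   ! allocated but never read or written
        ! ... billions of repeated optimisations writing into results ...
    end program optimiser

With the two allocate lines for x removed (and the declaration removed), the program gives Y1 everywhere; with them present, it gives Y1 under the debugger and Y2 when launched directly.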
This is obviously a non-trivial problem for me. Could this be happening because I have reached some sort of technical limit on the use of allocatable variables? If so, would cutting my program up into pieces resolve the issue? Any other comments or suggestions would be most welcome.
I am using IVS 18.104.22.168 in VS 2010. I have not listed my compiler options here, as the same behaviour crops up in both my debug and release configurations.