I noticed that my project behaves differently when I compile it with the default Debug settings than when I use the Release settings.
In Debug it works; in Release the calculations differ slightly and one of my program's checks fails (it's a program for engineering calculations, more specifically process simulation).
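To make clearer what I mean by "one of the checks fails", here is a simplified, made-up sketch of the kind of situation (the numbers, variable names, and tolerance are invented for illustration and are not from my actual code):

```cpp
#include <cmath>
#include <cstdio>

int main() {
    // Invented numbers, not from my real simulation: they only show that
    // the grouping of floating-point additions changes the result, which
    // is the kind of difference an optimizer may introduce (re-association
    // under relaxed FP settings, extended-precision x87 intermediates,
    // different register/memory spilling between Debug and Release, ...).
    double a = 1e16, b = -1e16, c = 1.0;

    double grouped_left  = (a + b) + c;  // 0 + 1 -> 1.0
    double grouped_right = a + (b + c);  // (b + c) rounds back to -1e16 -> 0.0

    std::printf("(a + b) + c = %.17g\n", grouped_left);
    std::printf("a + (b + c) = %.17g\n", grouped_right);

    // My failing check is roughly a strict comparison like this:
    double expected = 1.0;
    if (grouped_right != expected)
        std::printf("strict check failed\n");

    // A tolerance-based comparison would be robust to such differences:
    const double tol = 1e-9;
    if (std::fabs(grouped_left - expected) <= tol * std::fabs(expected))
        std::printf("tolerance check passed\n");

    return 0;
}
```

In my real code the differences between the two builds are only in the last few bits, but a strict comparison like the one above is enough to make the check trip in one configuration and pass in the other.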
After trying to eliminate the differences between the two configurations in how floating-point calculations are carried out (I thought the problem might lie in optimizations or floating-point speculation), I ended up solving the problem simply by compiling the Release build with the /Zi option (Debug Information Format).
The question is... why? What role does debug information play in how floating-point operations are evaluated?
Thanks in advance.