Suggestions for standardizing an arbitrary numerical precision library

I have been thinking about an important question, which I have raised with some important people, among them Bjarne Stroustrup, who in fact answered many of my mails; those exchanges included several discussions of operator precedence and handling. The question is this:

Much of today's software relies on simulations and heavy numerical crunching. Calculations for, for example, airplane wing design rely on heavy numerical software libraries. But there is no straightforward way to judge whether the calculations are correct, or whether the mathematical models are correct. Round-off errors may accumulate during the calculations, which may lead to unacceptable results. My idea is that you should develop a numerical library that can be run with arbitrary numerical precision, ideally with the precision settable at run time. If one can increase the precision and see that the end result converges to a specific value, one can assume that the model and the calculations are correct. Of course, that is only an inductive argument for correctness, not a deductive proof. It is, however, very important to know how big the error in the final result is. There are straightforward mathematical methods to estimate exactly that, but they are convenient only when the number of calculation steps is very small. Since simulations can run for hundreds of thousands of steps, one can at best only guess the error. The remedy is to rerun the calculation with higher numerical accuracy and see whether the new result matches the previous one. In particular, if the run is repeated with successively higher accuracy and the results converge, one has a strong argument that the computation is correct.
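As a rough illustration of this convergence check, here is a minimal sketch that reruns the same summation at three built-in precision levels and compares the results; in a true arbitrary-precision library the precision would be a parameter rather than a fixed type. The example itself (an alternating harmonic series and the function name alternating_sum) is my own illustrative choice, not part of any existing library.

#include <cstdio>

// Hypothetical illustration: sum the alternating harmonic series
// 1 - 1/2 + 1/3 - ... which converges to ln(2) ~ 0.6931472.
// Rerunning the same loop at increasing precision and comparing the
// results is the "inductive" correctness check described above.
template <typename Real>
Real alternating_sum(long terms) {
    Real sum = 0;
    for (long k = 1; k <= terms; ++k) {
        Real term = Real(1) / Real(k);
        sum += (k % 2 == 1) ? term : -term;
    }
    return sum;
}

int main() {
    const long n = 1000000;
    std::printf("float:       %.10f\n",  (double)alternating_sum<float>(n));
    std::printf("double:      %.15f\n",  alternating_sum<double>(n));
    std::printf("long double: %.18Lf\n", alternating_sum<long double>(n));
    // If the answers agree to the digits one cares about, round-off is
    // probably not dominating the result; if they drift apart, it is.
    return 0;
}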

Another important point: if you find a model, physical or chemical or scientific or whatever, that you suspect to be chaotic, how do you really verify whether it is chaotic or not? Chaotic systems are very important and should be taken seriously, and I personally find them very interesting. One way to investigate possible chaotic behaviour in a given system is to rerun the calculations with higher precision. If the chaotic behaviour is still found, one can conclude, by the principle of induction, that the system really is chaotic, which is an important conclusion if you are using the model for any practical purpose. I have done a lot of mathematical software work on computers myself and have often run into trouble with finite-precision calculations. For example, I did much work in optics and chemistry, where numerical precision was very important, especially for error calculations.
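To make that test concrete, here is a small sketch (my own illustrative example, not taken from any particular library) that iterates the logistic map x -> r*x*(1-x) at two precisions. For the chaotic parameter r = 4 the two trajectories separate after a few dozen steps, which is exactly the sensitivity that the rerun-at-higher-precision test is meant to expose.

#include <cstdio>

// Hypothetical illustration: the logistic map with r = 4 is known to be
// chaotic. Tracking the same trajectory in float and in double shows how
// quickly finite precision makes the two computations disagree.
int main() {
    const float  rf = 4.0f;
    const double rd = 4.0;
    float  xf = 0.3f;   // single-precision trajectory
    double xd = 0.3;    // double-precision trajectory
    for (int i = 1; i <= 60; ++i) {
        xf = rf * xf * (1.0f - xf);
        xd = rd * xd * (1.0 - xd);
        if (i % 10 == 0)
            std::printf("step %2d: float=%.7f double=%.15f diff=%+.2e\n",
                        i, (double)xf, xd, (double)xf - xd);
    }
    // If the disagreement keeps growing however much precision is added,
    // the sensitivity lies in the model itself, not in the arithmetic.
    return 0;
}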

Since you are such a big company with much influence on the software industry, I suggest that you try to standardize such an arbitrary-precision numerical library. There are some such libraries that can be used with many C++ compilers, but I see the need for a standardized one. Preferably, the library should not be bound to a specific CPU model; it should be CPU-independent. It is also very important to put a finger on the problems with finite-precision calculations and make people in the software industry aware of the situation.
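For reference, one existing but non-standardized option along these lines is Boost.Multiprecision. The sketch below assumes Boost is installed and uses its fixed 50- and 100-digit decimal types purely for illustration; the library also offers back-ends whose precision can be chosen at run time, which is closer to what I am proposing.

#include <boost/multiprecision/cpp_dec_float.hpp>
#include <iomanip>
#include <iostream>

int main() {
    using boost::multiprecision::cpp_dec_float_50;   //  50 decimal digits
    using boost::multiprecision::cpp_dec_float_100;  // 100 decimal digits

    // The same quantity computed at two precisions; comparing the leading
    // digits is the convergence check suggested above.
    cpp_dec_float_50  a = sqrt(cpp_dec_float_50(2));
    cpp_dec_float_100 b = sqrt(cpp_dec_float_100(2));

    std::cout << std::setprecision(50)  << a << "\n"
              << std::setprecision(100) << b << "\n";
    return 0;
}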

About the Xeon processor family (which I hope I can buy one or two of someday), I have the following idea: you should increase the number of registers in the CPU to perhaps 256, which would mean that heavy calculations, especially threaded ones, could be done at blazing speed. Also, today's numerical coprocessors are built on quite old designs that were developed during the early 90s, and I guess much can be done to improve the code, since we today have much more powerful CPUs.

My view of the computer era is that we are only at the very beginning of it; I see immensely vast opportunities to do wonders with computers. New CPU designs could include stacking or "sandwiching" of CPU wafers on top of each other. Perhaps one would need a good thermally conducting layer between them; I was thinking of pure copper or silver, but it might be easier to use highly pressurized helium instead. Hydrogen has the highest thermal conductivity among the gases, but since it is flammable, handling and processing might be hazardous, so helium is the best compromise. Even better is helium-3, the lighter isotope of helium, but it is very rare and expensive.

Well, I hope that you find some of my ideas useful.
