I am using CVF 6.5.A on Windows 2000 / Pentium III.
I read numbers from a text file into real(8) variables, and then I print these variables.
Numbers such as 1.625 and 6.5 come out as 1.625000 and 6.500000.
But numbers such as 1.2 and 1.551 come out as 1.199999 and 1.550999.
What is going on? Solutions?
There seems to be a pattern, though it may be coincidence: a value appears to be carried faithfully by a real(8) variable when the digits after the decimal point, read as an integer, have 5 as their only prime divisor (e.g. 625 = 5**4, but 2 and 551 do not qualify).
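In case it helps, here is a minimal sketch of what I am doing. The d0 literals stand in for the values read from the text file (reading and assigning produce the same binary conversion), and the exact digits printed will depend on the output format used:

```fortran
! Minimal sketch, not my exact program: assign the same decimal
! values to real(8) variables and print them list-directed.
program repro
  implicit none
  real(8) :: a, b, c, d
  a = 1.625d0   ! fractional digits 625 = 5**4
  b = 6.5d0     ! fractional digit  5   = 5**1
  c = 1.2d0     ! fractional digit  2, not a power of 5
  d = 1.551d0   ! fractional digits 551 = 19 * 29
  print *, a, b
  print *, c, d
end program repro
```

With this I see the same split between the "clean" values (a, b) and the "dirty" ones (c, d).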