I have a question about the F edit descriptor in READ statements. The general form is rFw.d (r = repeat count, w = field width, d = number of digits after the decimal point).
If I read the Chapman book correctly, d is more of a "nice to have": it must be present in the format (omitting it gives a compiler error), but its value is overridden when needed.
(E.g. the input 00000.1501 read with 1F10.2 is read correctly as 0.1501, as long as there is an explicit decimal point in the input and the number fits the kind of the real variable.)
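To make the case concrete, here is a minimal sketch of the kind of read I mean (program and variable names are just for illustration); my understanding is that the explicit decimal point in the input overrides the d of F10.2:

```fortran
program fdesc_test
  implicit none
  real :: x
  character(len=10) :: field

  ! Input field contains an explicit decimal point, so the d in
  ! F10.2 should be ignored and x should come out as 0.1501.
  field = '00000.1501'
  read (field, '(F10.2)') x
  print *, x
end program fdesc_test
```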
Is this standard behavior, or is it specific to IVF (Intel Visual Fortran)?
A second question: the field width w determines which part of the input record is interpreted. Is this fixed? Is only the position of the decimal point within the given field width variable?
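Again a small sketch of what I mean (hypothetical names): with F7.3 I would expect only the first seven characters of the record to be scanned, regardless of what follows:

```fortran
program width_test
  implicit none
  real :: x
  character(len=12) :: field

  ! F7.3: only characters 1..7 ('123.456') should be scanned;
  ! the trailing '789xx' should be ignored.
  field = '123.456789xx'
  read (field, '(F7.3)') x
  print *, x
end program width_test
```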
Thanks in advance