differences between gcc and icc

I have a small program that gives different output when compiled and run with gcc and then with icc. The program is simple, but it is a number cruncher, so maybe (hopefully) I can find out what switches on the icc compiler need to be changed so the output from both compilers matches. Maybe at least I can find out why they are different.

Is there any literature on this? A white paper maybe?

Any help appreciated. Thanks in advance.

Newport_j


This forum is for Cilk Plus questions. We don't know this information.

You should ask this question in the Intel C Compiler forum. They're more likely to have the information you're looking for.

- Barry

The defaults of gcc and icc are not at all comparable. If you are looking for the closest numerical match, without concern for full performance in all cases, you would set icc -fp-model source and compare with gcc -O3 -march=core2, or some such. Even a white paper couldn't cover all the common possibilities, such as comparing an obsolete gcc against a current icc.

Now wait a minute. Your response is neither helpful nor fair.

The computer example that I discussed in the initial email is found in an operations research textbook. When I programmed this a long time ago using Microsoft C, the numerical output was exactly the same as in the textbook. In fact, it is a discrete-event simulation program originally run on an IBM 365 mainframe and then programmed by operations research students (of which I was one) on a PC using Microsoft C compilers; it was also originally done in Fortran on a mainframe and on a PC (using MS Fortran) for the textbook. All output was the same.

Now when we moved on to simulation languages, the proof of their utility and accuracy was output that matched the C and Fortran output. An initial simulation language was SIMLIB, but there are many others. This was done at the University of Michigan, Ann Arbor, and the professor was a Ph.D. from Berkeley. We took it seriously and we were careful. Each time the program ran, in all of its versions, the output was the same; maybe it was a little off in the sixth decimal place, but that was rare.

I just read an article (from an Intel competitor) that stated clearly that computers before the IEEE standard was put into place tended to give different numerical answers; hence the implementation of an IEEE standard. We did not think it unusual, since it was nice to know that no matter what system or compiler we ran a program on, the answers were consistent.

Until now.

I am very bothered that the icc compiler (as good as it is) gives a different answer. I am certain that when I present this, or at least send out preliminary rough drafts of my presentation to selected reviewers, the question will come up: why does this compiler give different numerical output?

I will need to answer it.

Please note that I am only using the icc compiler on a single core here, with no parallelism. I believe in it; I have promoted Intel Cilk Plus as a solution to many of our slow-running programs that we want to speed up. This different output has been a nagging issue to me for a long time, and I must give a credible answer.

Thanks for reading my rant here, and please help me to explain this discrepancy.

Newport_j

I think what Tim is saying is that this is a complex topic. And the people who monitor the Intel C Compiler forum are more likely to have the details you're looking for. I recommend you ask this question there.

- Barry
