Huge binaries produced by 9.1 ?

Edit: Extra information further down the thread...

Hi,

I took a first swing at Intel C++ 9.1 running under Visual Studio 2005. After converting the project (with standard settings) to an Intel project type, I started the build. It is a project with approximately 120 files.

The Intel compiler has always been time consuming, but this time I had to leave for the day.

Problem: The release build produces object files around 11-15 MB in size each, where Visual Studio would make them around 40 KB each. My first guess was that there was something "extra" in the files that would be removed before linking.

The combined size of all *.obj files was now 1.6 GB. I find that silly.

The linker cannot link such a huge binary. It now complains that it cannot seek to a position:

Error 578 fatal error LNK1106: invalid file or disk full: cannot seek to 0x2F0FEFA9 ......ReValverII_lib.lib

The disk is not full. It is the resulting binary that has grown too large.

The funny thing is that I am not using any "special settings". This was my first try with this compiler version. Are there any other problems around the corner?


I can't guess your idea of standard settings. I suppose ICL options most similar to typical MSVC settings would be something like -O1 no vectorization, with similar SSE2 settings on both compilers, no multiple version option set. ICL always had vectorization on by default, so I can't guess whether you considered that a surprise.
They say the introduction of partial interprocedural optimization as a default in 9.1 was a response to it being present in the Microsoft compiler. If it creates a problem for you, you are entitled to post a report on premier.intel.com, where you would need an example and the specifics you don't care to divulge here.

I have been using every compiler version since C++ 4.5 and have used pretty much the same settings since. I find it curious that the object files, regardless of content, tend to be of that specific size, 11-14 MB. The actual settings used seem to be irrelevant to the object size, and even if I managed to "stumble upon the wrong switches", it is suspicious that they all yield the same result however I vary them.

Even in extreme cases you would not get a binary of 1.6 GB instead of the expected 1.5 MB. This is beyond extreme.

I also doubt the code itself is the source of the problem.

/Michael

Our dll sizes have stayed fairly constant from version 7.1, 9.0, and 9.1. Object files tend to be a few hundred K each.

If every object file in your project is uniformly large as you claim, I would look closely at your pch settings (pre-compiled header) or other included files.

Chris

Thanks for the advice. I tried turning off pre-compiled headers altogether, but it has no impact on the object sizes at all. Speaking of which: the file "precompile.cpp", which I use to create the pch, creates an object file of the same size, i.e. 11 MB, and this file in itself contains no code.

I had high hopes we could use the Intel compiler for both Windows and OS X now that it is available for both platforms. But because C++ 9.1 took so incredibly long to release in the first place, I took that time to familiarize myself with the native compiler in Visual Studio 2005, and it's not half bad. As it looks now, we'll probably release using Microsoft's compiler instead.

Thanks anyway.
/Michael

I am bumping this thread.

Using version ICC 9.1 034 I still get the same problem. I now know it is caused by the "Whole program optimization" option, specifically checking "Use link time code generation".

As the project size has grown since I first started this thread, the total object file directory is now closer to 2.7 GB. The error message is now:

xilib: executing 'lib'

F:DevstudioProjectsAudioPluginsabcRelease/abc_lib.lib : fatal error LNK1106: invalid file or disk full: cannot seek to 0x332F59C4

The project's target is a static library. It is not generated. Is there a file size limit for the temporary files?

By chance, are you unintentionally linking in a huge resource file? I don't mean "Do you have a huge resource file?"; rather, is something you do not intend being treated as, and linked in as, a resource file?

Jim Dempsey

I don't think so. The target I am building is a static library, and I don't think resources can be linked into that?

The only clue I have is the size of all the individual object files, 13-15 MB each regardless of content, and that these sizes only appear when "Whole program optimization" is used. (Hence I can't use it...)

Thanks for your answer Jim.

Michael,

Object files contain not only code and data but also "comments".

The "comments" are subject to interpretation by the tools that created them.

Typically these will be source file name, date, compiler name, compiler version, debug information, Edit and Continue information, and Object Browser information, as well as additional "stuff".

Try this:

Pick your smallest source file, the one you think will produce the smallest .OBJ file. Assume its name is foo.obj. Issue the command line:

DUMPBIN /ALL /DISASM foo.obj > foo.lst
NOTEPAD foo.lst

Replace "foo" with your file name. See what you find.

Jim Dempsey

JimDempseyAtTheCove:

Issue the command line:

DUMPBIN /ALL /DISASM foo.obj > foo.lst
NOTEPAD foo.lst

Well... I don't know what to look for. The smallest file becomes 77 kB when whole program optimization is off, and 11.3 MB when on.

Doing the dump you suggest creates a 56 MB text file which I can't interpret. Do you have any suggestion what to look for?

Thanks :-)

That is a size ratio of 146.75 to 1.

This is unusual to say the least.

What this sounds like is a confluence between the style of your programming and the options selected. (Note, I am not criticizing your programming style.)

Something like

a) compile to make the program to run as fast as possible
+
b) compile to make the program run on the widest set of platforms and hardware support
+
c) sacrifice size for speed
+
d) unroll loop size set very large
+
your code is predominantly a bunch of loops that can be unrolled and be compiled using SSE3 instructions and be capable of running on platforms without SSE3.

i.e. the loops are being unrolled, have multiple paths dependent on platform, have unknown alignment of input args (this creates special loop start and end sections to iterate until alignment of data).

Something has to account for the bloat.

If you cannot figure this out, then you may need to send the "small" source file and headers to Premier Support for examination.

Jim

JimDempseyAtTheCove:

That is a size ratio of 146.75 to 1. This is unusual to say the least.

What this sounds like is a confluence between the style of your programming and the options selected....

...

...loops that can be unrolled and be compiled using SSE3 instructions and be capable of running on platforms without SSE3.

...

Something has to account for the bloat.

...

Thanks Jim. I suppose I will send Intel support something to look at. A few other clues: the extra "bloat" does not seem to be code, but something pseudo-textual. Even precompile.cpp (containing no code in the source file) generates a file of 12 MB. I am using standard options, which seem to be "fast", but none of the points you mention above.

Strange indeed. Thanks for your help :-)

Hold on, we have the clue right here: "whole program optimization". I notice a massive explosion of object file sizes and so on with that kind of optimization. It comes as no surprise to me, and should not be seen as a particular "bug" in the Intel compiler. That kind of optimization results in a combinatorial explosion of things the compiler has to keep track of and check. The IPO files get huge. The Visual C++ compiler does nothing quite as sophisticated.

Depending on compiler option settings, the bloat may be the debug information for the include files. If not the debug information, then it is likely the information needed to link in the external symbols specified in the include files but not referenced by your application.

IMHO unreferenced externals should not be in the .OBJ file. I imagine there is a tool to strip this junk out of the .obj file.

Jim Dempsey

GNU has an OBJCOPY utility. There might be something similar for Windows:

http://www.gnu.org/software/binutils/manual/html_chapter/binutils_3.html

Jim

I expect that using the "whole program optimization" (/Qipo, I presume) is the primary issue here. When you use that, a lot of information is carried along in the objects and libs to do more optimization before producing the final executable. This should not affect the final executable; in fact it may even be smaller in some cases. Of course, if the static library is your final product, I would recommend not using "whole program optimization", because this extra info won't be used anyway when someone else uses the lib to build in their environment. You could try /Qip in place of /Qipo.
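The point that /Qipo objects carry extra optimization information rather than only final code has a direct analogue in GCC's -flto, which makes it easy to observe (a sketch under that assumption; the ICL flags themselves are Windows-only):

```shell
# An -flto object stores compiler IR in extra .gnu.lto_* sections,
# much as /Qipo objects carry Intel's intermediate representation.
cat > ipo_demo.c <<'EOF'
int triple(int x) { return 3 * x; }
EOF
gcc -O2 -c ipo_demo.c -o plain.o
gcc -O2 -flto -c ipo_demo.c -o lto.o
objdump -h lto.o | grep 'gnu.lto'   # IR sections absent from plain.o
```

The IR-bearing object is only useful when the final link is done by the same toolchain with IPO/LTO enabled, which is exactly why it buys nothing for a static library shipped to customers.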

I'm told that a 50x object size increase is not unusual, though 150x may be a little high. In particular if you can provide a test case with the seek error, please submit a bug to premier.intel.com.

Dale
