Linux Libraries Lost and Found

Developing an application to run on Linux systems is, for the most part, fun and rewarding.  I always enjoy using new tools and seeing how much I can get the tool to do for me.  But after construction is complete, you need to test it on the intended target devices.  I’ve been developing an application to run on MID (Mobile Internet Device) systems.  There are several operating systems being used for MIDs, but the current favorite seems to be MIDinux.  It is not based on Ubuntu Linux, which is what I’ve been developing on; it is based on Red Flag Linux.  Still, it shouldn’t be hard.  They are both Linux, right?

 Actually, it isn’t that easy for someone unfamiliar with the different flavors of Linux.  These two operating systems use different package installation tools.  You have to learn how to use both, and then you have to find the right packages to install.  Dependencies were haunting me for quite a while, and I’m not talking about foreign oil.  My tool needs two packages to be installed before use.  Only, on a clean system, one of those packages needs two other packages, and the second needs three.  And so it goes.  Eventually, you end up with a long list of libraries that you need to install on each new system, or you learn what I learned this week: you bundle the needed libraries into the distribution package.  There are at least two ways to accomplish this.  First, you can link the libraries statically into your executable.

 I started with this method, but the Anjuta IDE didn’t include the libraries in the executable even though I marked them as static.  This might have been my fault, although I had a team member look as well, and we couldn’t find anything to change.  I also <gasp> checked the user manual for Anjuta and tried to follow it, but again I ended up frustrated.  So, I went to the Linux guru in our department, and he gave me some pearls of wisdom which I will pass along to those of you who, like me, are fairly new to Linux.

 A second possible way to add the libraries is to include them in your distribution tarball in a separate folder.  Here is the way I was shown:

    1. Create two folders, named “bin” and “lib”.  Actually, you can name them anything you want, but the example below uses those two names.  Put the binary in the “bin” folder and the libraries that need to be included in the “lib” folder.  To find out which libraries need to go into the “lib” folder, first run the “ldd” command on the binary:

$ ldd <binary name>



You should see something like this list, although it will probably be longer:

linux-gate.so.1 =>  (0xb7f8d000)
<library 1>.so => /usr/lib/<library 1>.so (0xb7f5d000)
<library 2>.so => /usr/lib/<library 2>.so (0xb7f27000)
<library 3>.so => /usr/lib/<library 3>.so (0xb7f21000)
<library 4>.so => /lib/tls/i686/cmov/<library 4>.so (0xb7f18000)
<library 5>.so => /usr/lib/<library 5>.so (0xb7f0e000)


This is the list of libraries that your program depends on. You can use any method you prefer to put all those libraries into the “lib” folder.  Our guru suggested this shell command:

$ for i in `ldd <binary name> | cut -f2 -d"=" | cut -f2 -d" " | grep "\.so"`; do cp "$i" ../lib/; done


The above command assumes that you are in the “bin” folder where you have put your binary.  
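If the `cut` chain feels brittle, the same copy can be done with `awk`; this is a variation of my own, not the guru's, and `myapp` stands in for your binary name.  The `awk` pattern matches only lines whose dependency resolved to an actual file path (`=> /...`), which automatically skips pathless entries such as the vDSO:

```shell
#!/bin/sh
# Sketch (assumptions: run from the "bin" folder, ../lib already exists,
# and "myapp" is a placeholder for your binary name).
# awk prints the third field of each "name => /path (addr)" line.
ldd ./myapp | awk '/=> \// {print $3}' | while read -r lib; do
    cp "$lib" ../lib/
done
```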

  2.     Now, we want to force the OS to look in our “lib” folder first, rather than the default locations, so we set the LD_LIBRARY_PATH environment variable to point to our new folder:

$ LD_LIBRARY_PATH=../lib ldd ./<binary name>

This will result in the following list:

linux-gate.so.1 =>  (0xb7f8d000)
<library 1>.so => ../lib/<library 1>.so (0xb7f5d000)
<library 2>.so => ../lib/<library 2>.so (0xb7f27000)
<library 3>.so => ../lib/<library 3>.so (0xb7f21000)
<library 4>.so => ../lib/<library 4>.so (0xb7f18000)
<library 5>.so => ../lib/<library 5>.so (0xb7f0e000)

As you can see, the system is now finding these libraries in our “lib” folder.  The one problem with this method is that we have to set that environment variable every time the program is run.  The common solution is to create a script that acts as a “wrapper” for your application.  Here’s how to do it:

3.    To create the script:   

    •  Rename your binary to something else, such as <binary name>.bin              

$ mv <binary name>  <binary name>.bin

    •  Create a shell script with the original name of the binary:             


#!/bin/sh
# Get full path to this script
dir=`dirname "$0"`
cwd=`cd "$dir"; pwd`
# Execute the binary, after setting the variable, and pass all args to it
LD_LIBRARY_PATH="$cwd/../lib" "$cwd/<binary name>.bin" "$@"

    • Change the permissions on the script to make it executable:

$ chmod +x <binary name>
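Before shipping, you can convince yourself the wrapper really sets the variable by exercising the same layout with a stand-in binary that simply prints what it receives.  Everything below, including the `demo` folder and the name `myapp`, is hypothetical scaffolding of my own:

```shell
#!/bin/sh
# Build a toy bin/lib layout to test the wrapper pattern.
mkdir -p demo/bin demo/lib
# Stand-in for the real binary: it just echoes the variable it sees.
cat > demo/bin/myapp.bin <<'EOF'
#!/bin/sh
echo "LD_LIBRARY_PATH=$LD_LIBRARY_PATH"
EOF
# The wrapper, following the same pattern as the script above.
cat > demo/bin/myapp <<'EOF'
#!/bin/sh
dir=`dirname "$0"`
cwd=`cd "$dir"; pwd`
LD_LIBRARY_PATH="$cwd/../lib" "$cwd/myapp.bin" "$@"
EOF
chmod +x demo/bin/myapp demo/bin/myapp.bin
# Running the wrapper prints an absolute path ending in demo/bin/../lib
./demo/bin/myapp
```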



Armed with this new technique and with a renewed sense of hope, I loaded my tool onto a new build of MIDinux.  The first time I ran it, this is what I got:

****MEMORY-ERROR****: <binary name>.bin[19643]: GSlice: assertion failed: sys_page_size >= 2 * LARGEALIGNMENT

Slightly disappointing, wasn’t it?  The key to this problem was in the “lib” folder.  When using the “ldd” command, I got ALL the libraries that the application depended on.  MIDinux seems to prefer to use the default library location for the libraries it already has, rather than copies in another location.  Or maybe it just doesn’t like two copies of the same library.  Whatever the reason, I found that by removing from the “lib” directory all of the libraries that were already loaded on the OS, I was able to run my tool successfully.  Whew!
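If you run into the same duplicate-library problem, the pruning can be scripted.  This is only a sketch of the idea, not part of the guru's original advice: `ldconfig -p` prints the shared libraries the dynamic linker already knows about, and any bundled copy whose name appears in that list gets removed from our “lib” folder:

```shell
#!/bin/sh
# Sketch: drop bundled libraries the target OS already provides.
# Assumption: run from the folder that contains "lib".
# ldconfig -p lists cached libraries; awk keeps the name column.
system_libs=$(ldconfig -p 2>/dev/null | awk '{print $1}' || true)
for f in lib/*.so*; do
    [ -e "$f" ] || continue   # no matches: glob left unexpanded
    name=$(basename "$f")
    if printf '%s\n' "$system_libs" | grep -qx "$name"; then
        echo "system already provides $name; removing bundled copy"
        rm "$f"
    fi
done
```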

Well, I hope this blog will help any of you that have been struggling with just such a situation.  If you have any tips or suggestions you would like to share, I’d be happy to hear them, and I’m sure other people would, as well.  Until next time…

 Note:  Our Linux guru graciously gave me his permission to share his suggestions with all of you.




