Running Intel MPI on InfiniBand

A cluster of 2 nodes is set up "only" over InfiniBand (IPoIB); no Ethernet is configured. IP communication such as ping and ssh works fine, and password-less ssh to each node also works fine.

When the command is given like:

mpiexec.hydra -np 16 -iface ib0 -f cluster_list ./main.out -ppn 8

where cluster_list contains the lines:

node01
node02

we immediately get the following error:

forrtl: severe (174): SIGSEGV, segmentation fault occurred

However, if I comment out "any one" node name in the cluster_list file, like:

# node01
node02

or

node01
# node02

or launch with a single host on the command line:

mpiexec.hydra -np 16 -iface ib0 -host node02 ./main.out -ppn 8

or

mpiexec.hydra -np 16 -iface ib0 -host node01 ./main.out -ppn 8

and run the same command, we encounter "no errors" and the full runtime completes.

My configuration
=============
I'm using an IPoIB-only IB connection (Mellanox DDR HCA and SilverStorm switch). Other software includes: Intel Fortran Composer XE 2013.3.163, Intel MPI 4.1.1.036, NetCDF 4.0.0, FFT 3.3.3, Intel OFED.

It is worth noting that the error occurs only when two nodes are provided. I can run mpiexec.hydra on one node (i.e. node01) and provide -host node02 on the command line, and the process runs fine.

Please provide your suggestions to eliminate this error.

Thanks
Girish Nair
girish@njgroup.net (girish at njgroup dot net)
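In case it helps with the diagnosis, here is a minimal sketch of a debug run I can also try (assuming the standard Intel MPI environment variables I_MPI_DEBUG and I_MPI_FABRICS behave as documented in 4.1.1.036; the fabric choices below are only guesses for this IPoIB-only setup):

# Sketch only: raise runtime verbosity and pin the fabric explicitly
export I_MPI_DEBUG=5            # print per-rank fabric/provider selection at startup
export I_MPI_FABRICS=shm:dapl   # also try shm:ofa or shm:tcp to isolate the failing path
mpiexec.hydra -np 16 -iface ib0 -ppn 8 -f cluster_list ./main.out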

Director Supports
NJ Dataprint Private Limited

There's a separate Cluster & HPC forum where such questions are topical.  I also wonder why you aren't using mpirun, since in your Intel MPI version it appears to have replaced some former direct use of mpiexec.hydra.
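For instance, something along these lines should be equivalent (a sketch, assuming your existing cluster_list and Hydra options carry over unchanged, since the Intel MPI mpirun wrapper passes them through to mpiexec.hydra):

mpirun -np 16 -ppn 8 -iface ib0 -f cluster_list ./main.out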

Hi Tim,
Thanks for your response.

Well, I'm a new member, so I apologize if I've broken protocol. In fact, I was looking for the Cluster and HPC forum but unfortunately could not find it. Can you point me to the URL?

I was initially using mpirun before switching to mpiexec.hydra on the advice of a few HPC veterans.

- Girish Nair

Director Supports
NJ Dataprint Private Limited

Thanks a ton, Tim.

Great! I appreciate your time and effort. I will go through the URL you pointed out.

Good Day!

Director Supports
NJ Dataprint Private Limited
