SMPD : create process failed

Hi,

I'm trying to parallelize my Fortran code under Composer XE 2013 with the Intel MPI Library. I'm using the Simple Multi-Purpose Daemon (SMPD) to create my processes.

The problem is: when I call a simple subroutine in the MPI code, I get this kind of error:

CreateProcess failed, error 2
unable to start the local smpd manager.
CreateProcess failed, error 2
CreateProcess failed, error 2
unable to start the local smpd manager.
CreateProcess failed, error 2
unable to start the local smpd manager.
unable to start the local smpd manager.
invalid command received, unable to determine the destination: 'cmd=result src=0 dest=1 tag=9 cmd_tag=9 ctx_key=0 result=SUCCESS '
invalid command received, unable to determine the destination: 'cmd=result src=0 dest=1 tag=10 cmd_tag=9 ctx_key=2 result=SUCCESS '
invalid command received, unable to determine the destination: 'cmd=result src=0 dest=1 tag=11 cmd_tag=9 ctx_key=1 result=SUCCESS '
invalid command received, unable to determine the destination: 'cmd=result src=0 dest=1 tag=12 cmd_tag=9 ctx_key=3 result=SUCCESS '
mpiexec aborting job...

Once the subroutine call is commented out, the code works well.

In the project properties, I told the compiler to generate reentrant code (Code Generation -> Generate Reentrant Code -> Threaded), but the code still gives the same error.

Can you please help me understand this error?

James Tullos (Intel):

Can you provide sample code for the problem? How are you trying to launch the program?

Sincerely,
James Tullos
Technical Consulting Engineer
Intel® Cluster Tools

Thank you for the answer.

I get the error :

CreateProcess failed, error 2
unable to start the local smpd manager.
invalid command received, unable to determine the destination: 'cmd=result src=0 dest=1 tag=9 cmd_tag=9 ctx_key=0 result=SUCCESS '

when I call a simple subroutine from the MPI-parallelized code.

For example:

Call MPI_INIT (Ierror)
Call MPI_COMM_SIZE (MPI_COMM_WORLD,nbre_processus,Ierror)
Call MPI_COMM_RANK (MPI_COMM_WORLD,rang,Ierror)

......

N = N + 1            
Call EntierAleatoire(1,Nb_arbres_type,choix)       
dimensions_diffs(3*(N-1)+1) = dim_troncs_types(3*(choix-1)+1)

...

Call MPI_FINALIZE (Ierror)

where:

SUBROUTINE EntierAleatoire(int_a, int_b, int_al)
! Returns a uniformly distributed integer int_al in [int_a, int_b].
USE Initialisation_types
USE mkl95_precision
USE MPI

Implicit none

integer, INTENT(IN) :: int_a
integer, INTENT(IN) :: int_b
integer, INTENT(OUT) :: int_al
integer :: init_al
Real :: a

Common /Aleatoire/ init_al
If (init_al == 0) Then      ! seed the generator on the first call only
    call RANDOM_SEED
    init_al = 1
Endif
call DATE_AND_TIME          ! note: with no arguments this call is a no-op
call random_number(harvest=a)
int_al = INT(a * (int_b + 1 - int_a)) + int_a

End Subroutine EntierAleatoire
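
For reference, the same helper can also be written without the module dependencies and the COMMON block, which would help rule those out (a minimal sketch; RandomInt is a hypothetical name, not the routine used in the project):

SUBROUTINE RandomInt(int_a, int_b, int_al)
Implicit none
integer, INTENT(IN) :: int_a
integer, INTENT(IN) :: int_b
integer, INTENT(OUT) :: int_al
Real :: a
logical, SAVE :: seeded = .false.   ! SAVEd flag replaces the COMMON block

If (.not. seeded) Then
    call RANDOM_SEED                ! seed the generator on the first call only
    seeded = .true.
Endif
call random_number(harvest=a)       ! 0.0 <= a < 1.0
int_al = INT(a * (int_b + 1 - int_a)) + int_a   ! uniform integer in [int_a, int_b]
End Subroutine RandomInt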

I run my code using the command "mpiexec.exe -n 4 x64\Debug\myCode.exe" with the Simple Multi-Purpose Daemon (SMPD).

If I comment out the subroutine call, the code works very well. I get the same error with other subroutines, so I think the problem is caused by calling a subroutine within the MPI environment.
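
To isolate this, a stripped-down test along these lines (a sketch; all names are hypothetical) would exercise the same pattern of a plain subroutine call between MPI_INIT and MPI_FINALIZE:

Program Test_Appel
USE MPI
Implicit none
integer :: Ierror, rang

Call MPI_INIT (Ierror)
Call MPI_COMM_RANK (MPI_COMM_WORLD, rang, Ierror)
Call Dummy(rang)                    ! plain local subroutine call inside the MPI region
Call MPI_FINALIZE (Ierror)

Contains
Subroutine Dummy(r)
integer, INTENT(IN) :: r
print *, 'hello from rank', r
End Subroutine Dummy
End Program Test_Appel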

In the project properties I set Code Generation -> Generate Reentrant Code -> Threaded.

Can you please help me find an explanation for this problem?

James Tullos (Intel):

What other compilation options are you using? Can you compile your code via command line?

If I use the MPIEXEC wrapper, it generates exactly this command:

"C:\Program Files\MPICH2\bin\mpiexec.exe" -n 1  -noprompt "C:\Users\Ines\Desktop\CODES WINDOWS\Code Fortran MPI Rapide 28-11-2013\Code_Modelisation_foret\Code_Modelisation_foret\x64\Debug\Code_Modelisation_foret.exe"

and I get this error :

CreateProcess failed, error 2
unable to start the local smpd manager.
invalid command received, unable to determine the destination: 'cmd=result src=0 dest=1 tag=3 cmd_tag=2 ctx_key=0 result=SUCCESS '

If I comment out this line: Call EntierAleatoire(1,Nb_arbres_type,choix), with the same command, the code works well!

James Tullos (Intel):

Please try using the mpiexec wrapper provided with the Intel® MPI Library. If you run the file

C:\Program Files (x86)\Intel\MPI\4.1.3.045\em64t\bin\mpivars.bat

it will set your environment variables to point to the correct launcher and libraries. Also, try using mpiexec.hydra instead of mpiexec and see if that works.
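
For example, from a single command prompt (paths assume the default 4.1.3.045 installation mentioned above; "where mpiexec" simply reports which launcher is found first on the PATH):

"C:\Program Files (x86)\Intel\MPI\4.1.3.045\em64t\bin\mpivars.bat"
where mpiexec
mpiexec -n 4 x64\Debug\myCode.exe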

Sincerely,
James Tullos
Technical Consulting Engineer
Intel® Cluster Tools
