MPI_Comm_connect issue

Hi,

I am trying to connect processes running as clients to a server process.

I run the server as mpiexec -n 1  -genv I_MPI_FABRICS shm:tcp ./ConsoleApplication1.exe

and the clients as mpiexec -n 2 .\ConsoleApplication2.exe

The code for the server is shown below:

int rank, size, again;
char port_name[MPI_MAX_PORT_NAME];
MPI_Status status;
MPI_Comm client;
double buf[MAX_DATA];

MPI_Init(NULL, NULL);
MPI_Comm_size(MPI_COMM_WORLD, &size);
MPI_Comm_rank(MPI_COMM_WORLD, &rank);

if (rank == 0)
{
    MPI_Open_port(MPI_INFO_NULL, port_name);

    /* Publish the port name in the file the client reads.
       (This step was not in the original post; it is assumed from the
       client code, which reads E:/mpi_port_name.txt.) */
    ofstream ofs("E:/mpi_port_name.txt");
    ofs << port_name;
    ofs.close();

    while (1) {
        MPI_Comm_accept(port_name, MPI_INFO_NULL, 0, MPI_COMM_WORLD,
                        &client);
        again = 1;
        while (again) {
            MPI_Recv(buf, MAX_DATA, MPI_DOUBLE,
                     MPI_ANY_SOURCE, MPI_ANY_TAG, client, &status);
            switch (status.MPI_TAG) {
            case 0: /* shut the server down */
                MPI_Comm_free(&client);
                MPI_Close_port(port_name);
                MPI_Finalize();
                return 0;
            case 1: /* client is done; go back to accepting */
                MPI_Comm_disconnect(&client);
                again = 0;
                break;
            case 2:
                printf("value received %lf\n", buf[0]);
                break;
            default:
                /* Unexpected message type */
                MPI_Abort(MPI_COMM_WORLD, 1);
            }
        }
    }
}
MPI_Finalize();
return 0;

 

and the code for the client (a test code only) is shown below:

 

MPI_Comm server;
double buf[MAX_DATA];
char port_name[MPI_MAX_PORT_NAME];
int tag = 2, n = MAX_DATA;
int rank, size;

MPI_Init(&argc, &argv);
MPI_Comm_size(MPI_COMM_WORLD, &size);
MPI_Comm_rank(MPI_COMM_WORLD, &rank);

/* Read the port name published by the server. */
ifstream ifs("E:/mpi_port_name.txt", ios_base::in);
ifs.getline(port_name, MPI_MAX_PORT_NAME); /* buffer holds MPI_MAX_PORT_NAME chars */
ifs.close();

MPI_Comm_connect(port_name, MPI_INFO_NULL, 0, MPI_COMM_SELF,
                 &server);

////while (!done) {
//    tag = 2; /* Action to perform */
//    buf[0] = 3.1415926;
//    MPI_Send(buf, n, MPI_DOUBLE, 0, tag, server);
//    /* etc */
////}
//MPI_Send(buf, 0, MPI_DOUBLE, 0, 1, server);
//MPI_Comm_disconnect(&server);
MPI_Finalize();
return 0;

 

Now the problem I am facing is that the client code hangs in the MPI_Comm_connect call, whereas if I run a single client the connection is established properly.

 

My problem is that I have to connect more than one process of a job to another process dynamically. I am new to MPI, so any help will be much appreciated.
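One general MPI point that may be relevant here: MPI_Comm_connect is collective over the communicator passed to it. With MPI_COMM_SELF, each of the two client ranks issues its own independent connection request, so the server must come back around its accept loop once per client; while the server is blocked elsewhere (for example in MPI_Recv), the second connect cannot complete. A minimal sketch of the alternative, where both client ranks join a single collective connect over MPI_COMM_WORLD (the file path is taken from the client code above; everything else is standard MPI):

```c
/* Sketch: all client ranks connect together, so the server performs ONE
   MPI_Comm_accept and receives an intercommunicator spanning both clients. */
#include <mpi.h>
#include <stdio.h>
#include <string.h>

int main(int argc, char **argv)
{
    char port_name[MPI_MAX_PORT_NAME];
    MPI_Comm server;
    int rank;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    /* Only the root of the connect (rank 0 here) needs a valid port name;
       the collective call hands the resulting intercommunicator to every
       rank in MPI_COMM_WORLD. */
    if (rank == 0) {
        FILE *f = fopen("E:/mpi_port_name.txt", "r");
        fgets(port_name, MPI_MAX_PORT_NAME, f);
        fclose(f);
        port_name[strcspn(port_name, "\r\n")] = '\0'; /* strip trailing newline */
    }

    /* Collective over MPI_COMM_WORLD: both client ranks participate,
       and the server accepts once. */
    MPI_Comm_connect(port_name, MPI_INFO_NULL, 0, MPI_COMM_WORLD, &server);

    /* ... each rank can now MPI_Send to rank 0 on the server side of `server` ... */

    MPI_Comm_disconnect(&server);
    MPI_Finalize();
    return 0;
}
```

With this shape the server does one MPI_Comm_accept per client *job* rather than one per client *process*; whether that fits depends on how the real clients are structured.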

 

Regards

Ujwal

 

 

 


This question would be more likely to get attention on the companion forum on hpc and cluster computing. 
