Hello. I have read a couple of posts on Stack Overflow suggesting that independently started processes should be able to communicate via ompi-server, but I have not been able to get this working. Is it actually possible? Below is an outline of what I have done and the error I receive. I am using Open MPI 1.8.1.
> ompi-server --no-daemonize -r +
78512128.0;tcp://192.168.1.219:50443

> mpirun -np 1 --hostfile ~/mpi-hosts --ompi-server "78512128.0;tcp://192.168.1.219:50443" -v port_server
Master of rank 0 of 1 on kurenai
Master port is 488308736.0;tcp://192.168.1.219:60050+488308737.0;tcp://192.168.1.219:48993:300

port_server code outline:

    char port[MPI::MAX_PORT_NAME];
    MPI::Intercomm intercomm;
    MPI::Info info;
    int buffer[3];

    MPI::Open_port(MPI::INFO_NULL, port);
    printf("Master port is %s\n", port);
    info = MPI::Info::Create();
    info.Set("ompi_global_scope", "true");
    MPI::Publish_name("test_service", info, port);
    intercomm = MPI::COMM_WORLD.Accept(port, MPI::INFO_NULL, 0);
    buffer[0] = 1;
    buffer[1] = 2;
    buffer[2] = 3;
    intercomm.Send(buffer, 3, MPI::INT, 0, PARENT_TAG);
    ...

> mpirun -np 1 --hostfile ~/mpi-hosts --ompi-server "78512128.0;tcp://192.168.1.219:50443" -v port_client
barronj@192.168.1.219's password:
Slave of rank 0 of 1 on athena
Slave found test_service on port, 488308736.0;tcp://192.168.1.219:60050+488308737.0;tcp://192.168.1.219:48993:300
[athena:08054] [[27027,0],0]-[[7451,0],0] mca_oob_tcp_peer_send_handler: invalid connection state (6) on socket 19

client code outline:

    char port[MPI::MAX_PORT_NAME];
    MPI::Intercomm intercomm;
    int buffer[3];

    MPI::Lookup_name("test_service", MPI::INFO_NULL, port);
    printf("Slave found test_service on port, %s\n", port);
    intercomm = MPI::COMM_WORLD.Connect(port, MPI::INFO_NULL, 0);
    intercomm.Recv(buffer, 3, MPI::INT, 0, PARENT_TAG);
    ...

From testing, the failure appears to occur inside Connect. What am I doing incorrectly?

Thank you,
James Barron
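For reference, here is a sketch of the same server-side rendezvous written against the plain C bindings (the C++ bindings used above were deprecated in MPI-2.2 and removed in MPI-3). This is a minimal illustration under the same assumptions as the outline above — the service name "test_service" is reused, and the tag value 0 stands in for the elided PARENT_TAG constant:

```c
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    char port[MPI_MAX_PORT_NAME];
    MPI_Comm intercomm;
    MPI_Info info;
    int buffer[3] = {1, 2, 3};

    MPI_Init(&argc, &argv);

    /* Open a port and print it, mirroring the C++ outline above. */
    MPI_Open_port(MPI_INFO_NULL, port);
    printf("Master port is %s\n", port);

    /* Ask ompi-server to publish the name globally so a separately
     * launched mpirun job can look it up. */
    MPI_Info_create(&info);
    MPI_Info_set(info, "ompi_global_scope", "true");
    MPI_Publish_name("test_service", info, port);

    /* Block until the client connects, then send it the buffer
     * (tag 0 is a placeholder for PARENT_TAG). */
    MPI_Comm_accept(port, MPI_INFO_NULL, 0, MPI_COMM_WORLD, &intercomm);
    MPI_Send(buffer, 3, MPI_INT, 0, 0, intercomm);

    /* Clean up the published name and the port before exiting. */
    MPI_Unpublish_name("test_service", info, port);
    MPI_Close_port(port);
    MPI_Info_free(&info);
    MPI_Comm_disconnect(&intercomm);
    MPI_Finalize();
    return 0;
}
```

(The client side would use MPI_Lookup_name and MPI_Comm_connect symmetrically.)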