Hi, 
I am trying to run the OSU MPI benchmark tests on an InfiniBand setup (two
nodes connected back-to-back via Mellanox hardware). I am using the following
command:

mpirun --prefix /usr/local/ -np 2 --mca btl openib,self \
    -H 192.168.4.91 -H 192.168.4.92 \
    --mca orte_base_help_aggregate 0 --mca btl_openib_cpc_include oob \
    /root/osu_benchmarks-3.1.1/osu_latency
But I am getting the following error:

[Isengard:05030] *** An error occurred in MPI_Barrier
[Isengard:05030] *** on communicator MPI_COMM_WORLD
[Isengard:05030] *** MPI_ERR_IN_STATUS: error code in status
[Isengard:05030] *** MPI_ERRORS_ARE_FATAL (your MPI job will now abort)
[Rohan:05010] *** An error occurred in MPI_Barrier
[Rohan:05010] *** on communicator MPI_COMM_WORLD
[Rohan:05010] *** MPI_ERR_IN_STATUS: error code in status
[Rohan:05010] *** MPI_ERRORS_ARE_FATAL (your MPI job will now abort)

Am I missing anything in the above command? Any suggestions would be appreciated.
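
One thing I am not sure about is whether repeating -H is the right way to
list both hosts; the single comma-separated form I would otherwise expect
to use is:

mpirun --prefix /usr/local/ -np 2 --mca btl openib,self \
    -H 192.168.4.91,192.168.4.92 \
    --mca orte_base_help_aggregate 0 --mca btl_openib_cpc_include oob \
    /root/osu_benchmarks-3.1.1/osu_latency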

Regards,
Ramu
