Dear all,

I have recently installed Open MPI 1.1.2 on an OpenSSI cluster running Fedora Core 3. I tested a simple hello-world MPI program (attached) and it runs OK as root. However, if I run the same program as a normal user, it gives the following error:
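(For reference, the attached program is essentially a standard MPI hello world; the sketch below shows the kind of thing I am running, not the exact attached source.)

#include <stdio.h>
#include <mpi.h>

int main(int argc, char **argv)
{
    int rank, size;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);
    printf("Hello from rank %d of %d\n", rank, size);
    MPI_Finalize();
    return 0;
}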
[eddie@oceanus:~/home2/mpi_tut]$ mpirun -np 2 tut01
[oceanus:125089] mca_common_sm_mmap_init: ftruncate failed with errno=13
[oceanus:125089] mca_mpool_sm_init: unable to create shared memory mapping ( /tmp/openmpi-sessions-eddie@localhost_0/default-universe/1/shared_mem_pool.localhost )
--------------------------------------------------------------------------
It looks like MPI_INIT failed for some reason; your parallel process is
likely to abort. There are many reasons that a parallel process can
fail during MPI_INIT; some of which are due to configuration or
environment problems. This failure appears to be an internal failure;
here's some additional information (which may only be relevant to an
Open MPI developer):

  PML add procs failed
  --> Returned "Out of resource" (-2) instead of "Success" (0)
--------------------------------------------------------------------------
*** An error occurred in MPI_Init
*** before MPI was initialized
*** MPI_ERRORS_ARE_FATAL (goodbye)
[eddie@oceanus:~/home2/mpi_tut]$

Do I need to give the user certain permissions in order to oversubscribe processes?

Thanks in advance,
Eddie.
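(errno=13 is EACCES, "Permission denied". The small standalone program below is only my guess at the operation that seems to be failing inside mca_common_sm_mmap_init, i.e. creating and ftruncate-ing a backing file for the shared-memory pool; the path is copied from the error message and the size is arbitrary, so it is a sketch for diagnosis rather than the actual Open MPI code.)

#include <stdio.h>
#include <string.h>
#include <errno.h>
#include <fcntl.h>
#include <unistd.h>
#include <sys/types.h>
#include <sys/stat.h>

int main(void)
{
    /* Path taken verbatim from the mpirun error output above. */
    const char *path =
        "/tmp/openmpi-sessions-eddie@localhost_0/default-universe/1/shared_mem_pool.localhost";
    int fd = open(path, O_CREAT | O_RDWR, 0600);

    if (fd < 0) {
        fprintf(stderr, "open failed: %s\n", strerror(errno));
        return 1;
    }
    /* Grow the backing file, as the shared-memory setup appears to do. */
    if (ftruncate(fd, 1 << 20) < 0) {
        /* errno 13 here would match the "ftruncate failed with errno=13"
           reported by mpirun when running as a normal user. */
        fprintf(stderr, "ftruncate failed with errno=%d: %s\n",
                errno, strerror(errno));
    }
    close(fd);
    return 0;
}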
Attachment: ompi-output.tar.gz (gzip-compressed data)