Hello,

I have compiled Open MPI and another program for use on a different RHEL 5 / x86_64
machine. After transferring the binaries and setting up the environment variables,
is there anything else I need to do for Open MPI to run properly? When executing my
program I get:
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD 
with errorcode 1.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------
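
As I understand it, this is the banner Open MPI prints when the application itself
calls MPI_Abort with errorcode 1, so the errorcode may well be coming out of my own
code rather than the installation. A minimal sketch of the kind of call that produces
exactly this output (not my actual source, just an illustration):

    #include <stdio.h>
    #include <mpi.h>

    int main(int argc, char *argv[])
    {
        int rank;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);

        if (rank == 0) {
            /* A failed check in the application; aborting with errorcode 1
               produces the "MPI_ABORT was invoked on rank 0 ... with
               errorcode 1" message shown above. */
            fprintf(stderr, "fatal: required input not found\n");
            MPI_Abort(MPI_COMM_WORLD, 1);
        }

        MPI_Finalize();
        return 0;
    }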

ompi_info gives:

                 Package: Open MPI root@adduct1 Distribution
                Open MPI: 1.3
   Open MPI SVN revision: r20295
   Open MPI release date: Jan 19, 2009
                Open RTE: 1.3
   Open RTE SVN revision: r20295
   Open RTE release date: Jan 19, 2009
                    OPAL: 1.3
       OPAL SVN revision: r20295
       OPAL release date: Jan 19, 2009
            Ident string: 1.3
                  Prefix: /opt/openmpi-1.3_static/intel
 Configured architecture: x86_64-unknown-linux-gnu
          Configure host: adduct1
           Configured by: root
           Configured on: Tue Mar 10 17:57:14 PDT 2009
          Configure host: adduct1
                Built by: Ben
                Built on: Tue Mar 10 18:11:01 PDT 2009
              Built host: adduct1
              C bindings: yes
            C++ bindings: yes
      Fortran77 bindings: yes (all)
      Fortran90 bindings: yes
 Fortran90 bindings size: small
              C compiler: icc
     C compiler absolute: /opt/intel/Compiler/11.0/081/bin/intel64/icc
            C++ compiler: icpc
   C++ compiler absolute: /opt/intel/Compiler/11.0/081/bin/intel64/icpc
      Fortran77 compiler: ifort
  Fortran77 compiler abs: /opt/intel/Compiler/11.0/081/bin/intel64/ifort
      Fortran90 compiler: ifort
  Fortran90 compiler abs: /opt/intel/Compiler/11.0/081/bin/intel64/ifort
             C profiling: yes
           C++ profiling: yes
     Fortran77 profiling: yes
     Fortran90 profiling: yes
          C++ exceptions: no
          Thread support: posix (mpi: no, progress: no)
           Sparse Groups: no
  Internal debug support: no
     MPI parameter check: runtime
Memory profiling support: no
Memory debugging support: no
         libltdl support: yes
   Heterogeneous support: no
 mpirun default --prefix: no
         MPI I/O support: yes
       MPI_WTIME support: gettimeofday
Symbol visibility support: yes
   FT Checkpoint support: no  (checkpoint thread: no)
           MCA backtrace: execinfo (MCA v2.0, API v2.0, Component v1.3)
              MCA memory: ptmalloc2 (MCA v2.0, API v2.0, Component v1.3)
           MCA paffinity: linux (MCA v2.0, API v2.0, Component v1.3)
               MCA carto: auto_detect (MCA v2.0, API v2.0, Component v1.3)
               MCA carto: file (MCA v2.0, API v2.0, Component v1.3)
           MCA maffinity: first_use (MCA v2.0, API v2.0, Component v1.3)
               MCA timer: linux (MCA v2.0, API v2.0, Component v1.3)
         MCA installdirs: env (MCA v2.0, API v2.0, Component v1.3)
         MCA installdirs: config (MCA v2.0, API v2.0, Component v1.3)
                 MCA dpm: orte (MCA v2.0, API v2.0, Component v1.3)
              MCA pubsub: orte (MCA v2.0, API v2.0, Component v1.3)
           MCA allocator: basic (MCA v2.0, API v2.0, Component v1.3)
           MCA allocator: bucket (MCA v2.0, API v2.0, Component v1.3)
                MCA coll: basic (MCA v2.0, API v2.0, Component v1.3)
                MCA coll: hierarch (MCA v2.0, API v2.0, Component v1.3)
                MCA coll: inter (MCA v2.0, API v2.0, Component v1.3)
                MCA coll: self (MCA v2.0, API v2.0, Component v1.3)
                MCA coll: sm (MCA v2.0, API v2.0, Component v1.3)
                MCA coll: tuned (MCA v2.0, API v2.0, Component v1.3)
                  MCA io: romio (MCA v2.0, API v2.0, Component v1.3)
               MCA mpool: fake (MCA v2.0, API v2.0, Component v1.3)
               MCA mpool: rdma (MCA v2.0, API v2.0, Component v1.3)
               MCA mpool: sm (MCA v2.0, API v2.0, Component v1.3)
                 MCA pml: cm (MCA v2.0, API v2.0, Component v1.3)
                 MCA pml: ob1 (MCA v2.0, API v2.0, Component v1.3)
                 MCA pml: v (MCA v2.0, API v2.0, Component v1.3)
                 MCA bml: r2 (MCA v2.0, API v2.0, Component v1.3)
              MCA rcache: vma (MCA v2.0, API v2.0, Component v1.3)
                 MCA btl: self (MCA v2.0, API v2.0, Component v1.3)
                 MCA btl: sm (MCA v2.0, API v2.0, Component v1.3)
                 MCA btl: tcp (MCA v2.0, API v2.0, Component v1.3)
                MCA topo: unity (MCA v2.0, API v2.0, Component v1.3)
                 MCA osc: pt2pt (MCA v2.0, API v2.0, Component v1.3)
                 MCA osc: rdma (MCA v2.0, API v2.0, Component v1.3)
                 MCA iof: hnp (MCA v2.0, API v2.0, Component v1.3)
                 MCA iof: orted (MCA v2.0, API v2.0, Component v1.3)
                 MCA iof: tool (MCA v2.0, API v2.0, Component v1.3)
                 MCA oob: tcp (MCA v2.0, API v2.0, Component v1.3)
                MCA odls: default (MCA v2.0, API v2.0, Component v1.3)
                 MCA ras: slurm (MCA v2.0, API v2.0, Component v1.3)
               MCA rmaps: rank_file (MCA v2.0, API v2.0, Component v1.3)
               MCA rmaps: round_robin (MCA v2.0, API v2.0, Component v1.3)
               MCA rmaps: seq (MCA v2.0, API v2.0, Component v1.3)
                 MCA rml: oob (MCA v2.0, API v2.0, Component v1.3)
              MCA routed: binomial (MCA v2.0, API v2.0, Component v1.3)
              MCA routed: direct (MCA v2.0, API v2.0, Component v1.3)
              MCA routed: linear (MCA v2.0, API v2.0, Component v1.3)
                 MCA plm: rsh (MCA v2.0, API v2.0, Component v1.3)
                 MCA plm: slurm (MCA v2.0, API v2.0, Component v1.3)
               MCA filem: rsh (MCA v2.0, API v2.0, Component v1.3)
             
I hope this has not been asked and answered a bunch of times already; I couldn't
find the answer, though.
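
To separate my program from the installation itself, my plan is to first build and
run a trivial test with the mpicc from the transferred install and check that every
rank reports in, something along these lines (a sketch):

    #include <stdio.h>
    #include <mpi.h>

    int main(int argc, char *argv[])
    {
        int rank, size, len;
        char name[MPI_MAX_PROCESSOR_NAME];

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_size(MPI_COMM_WORLD, &size);
        MPI_Get_processor_name(name, &len);

        printf("rank %d of %d on %s\n", rank, size, name);

        MPI_Finalize();
        return 0;
    }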

Ben
