The command 'ompi_info --param btl openib' doesn't return any openib
component.
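
Note: on Open MPI 1.7 and later, ompi_info hides MCA parameters above
verbosity level 1 by default, so a component's parameters only show up
once the level is raised. Both forms below are standard ompi_info usage:

  ompi_info --param btl openib --level 9
  ompi_info --all | grep -i openib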

When I try a command like 'mpirun --mca btl self,sm,openib ...', it
throws an error:
--------------------------------------------------------------------------
A requested component was not found, or was unable to be opened.  This
means that this component is either not installed or is unable to be
used on your system (e.g., sometimes this means that shared libraries
that the component requires are unable to be found/loaded).  Note that
Open MPI stopped checking at the first component that it did not find.

Host:      polaris
Framework: btl
Component: openib
--------------------------------------------------------------------------
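
If it helps, I can rerun with BTL verbosity turned up to capture why the
component fails to open (btl_base_verbose is a standard MCA verbosity
parameter; './my_app' below is a placeholder for the actual executable):

  mpirun --mca btl self,sm,openib --mca btl_base_verbose 100 -np 2 ./my_app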

Regards,
Chaitra




On Wed, Jul 30, 2014 at 2:40 PM, Ralph Castain <r...@open-mpi.org> wrote:

> According to your output, you *do* have the IB components available:
>
>                  MCA btl: openib (MCA v2.0, API v2.0, Component v1.8.1)
>
>
> What made you think that you don't have them?
>
>
> On Jul 30, 2014, at 12:10 AM, Chaitra Kumar <chaitragku...@gmail.com>
> wrote:
>
> Hi Howard,
>
> The attached file "config.out" has the output of configure.
>
> Output of ompi_info command:
>                  Package: Open MPI padmanac@polaris-4 Distribution
>                 Open MPI: 1.8.1
>   Open MPI repo revision: r31483
>    Open MPI release date: Apr 22, 2014
>                 Open RTE: 1.8.1
>   Open RTE repo revision: r31483
>    Open RTE release date: Apr 22, 2014
>                     OPAL: 1.8.1
>       OPAL repo revision: r31483
>        OPAL release date: Apr 22, 2014
>                  MPI API: 3.0
>             Ident string: 1.8.1
>                   Prefix: /home/padmanac/openmpi181
>  Configured architecture: x86_64-unknown-linux-gnu
>           Configure host: polaris-4
>            Configured by: padmanac
>            Configured on: Tue Jul 29 11:41:12 PDT 2014
>           Configure host: polaris-4
>                 Built by: padmanac
>                 Built on: Tue Jul 29 11:57:53 PDT 2014
>               Built host: polaris-4
>               C bindings: yes
>             C++ bindings: yes
>              Fort mpif.h: yes (all)
>             Fort use mpi: yes (limited: overloading)
>        Fort use mpi size: deprecated-ompi-info-value
>         Fort use mpi_f08: no
>  Fort mpi_f08 compliance: The mpi_f08 module was not built
>   Fort mpi_f08 subarrays: no
>            Java bindings: no
>   Wrapper compiler rpath: runpath
>               C compiler: gcc
>      C compiler absolute: /opt/gcc/bin/gcc
>   C compiler family name: GNU
>       C compiler version: 4.8.2
>             C++ compiler: g++
>    C++ compiler absolute: /opt/gcc/bin/g++
>            Fort compiler: gfortran
>        Fort compiler abs: /opt/gcc/bin/gfortran
>          Fort ignore TKR: no
>    Fort 08 assumed shape: no
>       Fort optional args: no
>       Fort BIND(C) (all): no
>       Fort ISO_C_BINDING: no
>  Fort SUBROUTINE BIND(C): no
>        Fort TYPE,BIND(C): no
>  Fort T,BIND(C,name="a"): no
>             Fort PRIVATE: no
>           Fort PROTECTED: no
>            Fort ABSTRACT: no
>        Fort ASYNCHRONOUS: no
>           Fort PROCEDURE: no
>  Fort f08 using wrappers: no
>              C profiling: yes
>            C++ profiling: yes
>    Fort mpif.h profiling: yes
>   Fort use mpi profiling: yes
>    Fort use mpi_f08 prof: no
>           C++ exceptions: no
>           Thread support: posix (MPI_THREAD_MULTIPLE: no, OPAL support: yes,
>                           OMPI progress: no, ORTE progress: yes, Event lib: yes)
>            Sparse Groups: no
>   Internal debug support: no
>   MPI interface warnings: yes
>      MPI parameter check: runtime
> Memory profiling support: no
> Memory debugging support: no
>          libltdl support: yes
>    Heterogeneous support: no
>  mpirun default --prefix: no
>          MPI I/O support: yes
>        MPI_WTIME support: gettimeofday
>      Symbol vis. support: yes
>    Host topology support: yes
>           MPI extensions:
>    FT Checkpoint support: no (checkpoint thread: no)
>    C/R Enabled Debugging: no
>      VampirTrace support: yes
>   MPI_MAX_PROCESSOR_NAME: 256
>     MPI_MAX_ERROR_STRING: 256
>      MPI_MAX_OBJECT_NAME: 64
>         MPI_MAX_INFO_KEY: 36
>         MPI_MAX_INFO_VAL: 256
>        MPI_MAX_PORT_NAME: 1024
>   MPI_MAX_DATAREP_STRING: 128
>            MCA backtrace: execinfo (MCA v2.0, API v2.0, Component v1.8.1)
>             MCA compress: bzip (MCA v2.0, API v2.0, Component v1.8.1)
>             MCA compress: gzip (MCA v2.0, API v2.0, Component v1.8.1)
>                  MCA crs: none (MCA v2.0, API v2.0, Component v1.8.1)
>                   MCA db: hash (MCA v2.0, API v1.0, Component v1.8.1)
>                   MCA db: print (MCA v2.0, API v1.0, Component v1.8.1)
>                MCA event: libevent2021 (MCA v2.0, API v2.0, Component v1.8.1)
>                MCA hwloc: hwloc172 (MCA v2.0, API v2.0, Component v1.8.1)
>                   MCA if: posix_ipv4 (MCA v2.0, API v2.0, Component v1.8.1)
>                   MCA if: linux_ipv6 (MCA v2.0, API v2.0, Component v1.8.1)
>          MCA installdirs: env (MCA v2.0, API v2.0, Component v1.8.1)
>          MCA installdirs: config (MCA v2.0, API v2.0, Component v1.8.1)
>               MCA memory: linux (MCA v2.0, API v2.0, Component v1.8.1)
>                MCA pstat: linux (MCA v2.0, API v2.0, Component v1.8.1)
>                  MCA sec: basic (MCA v2.0, API v1.0, Component v1.8.1)
>                MCA shmem: mmap (MCA v2.0, API v2.0, Component v1.8.1)
>                MCA shmem: posix (MCA v2.0, API v2.0, Component v1.8.1)
>                MCA shmem: sysv (MCA v2.0, API v2.0, Component v1.8.1)
>                MCA timer: linux (MCA v2.0, API v2.0, Component v1.8.1)
>                  MCA dfs: app (MCA v2.0, API v1.0, Component v1.8.1)
>                  MCA dfs: orted (MCA v2.0, API v1.0, Component v1.8.1)
>                  MCA dfs: test (MCA v2.0, API v1.0, Component v1.8.1)
>               MCA errmgr: default_app (MCA v2.0, API v3.0, Component v1.8.1)
>               MCA errmgr: default_hnp (MCA v2.0, API v3.0, Component v1.8.1)
>               MCA errmgr: default_orted (MCA v2.0, API v3.0, Component v1.8.1)
>               MCA errmgr: default_tool (MCA v2.0, API v3.0, Component v1.8.1)
>                  MCA ess: env (MCA v2.0, API v3.0, Component v1.8.1)
>                  MCA ess: hnp (MCA v2.0, API v3.0, Component v1.8.1)
>                  MCA ess: singleton (MCA v2.0, API v3.0, Component v1.8.1)
>                  MCA ess: slurm (MCA v2.0, API v3.0, Component v1.8.1)
>                  MCA ess: tool (MCA v2.0, API v3.0, Component v1.8.1)
>                MCA filem: raw (MCA v2.0, API v2.0, Component v1.8.1)
>              MCA grpcomm: bad (MCA v2.0, API v2.0, Component v1.8.1)
>                  MCA iof: hnp (MCA v2.0, API v2.0, Component v1.8.1)
>                  MCA iof: mr_hnp (MCA v2.0, API v2.0, Component v1.8.1)
>                  MCA iof: mr_orted (MCA v2.0, API v2.0, Component v1.8.1)
>                  MCA iof: orted (MCA v2.0, API v2.0, Component v1.8.1)
>                  MCA iof: tool (MCA v2.0, API v2.0, Component v1.8.1)
>                 MCA odls: default (MCA v2.0, API v2.0, Component v1.8.1)
>                  MCA oob: tcp (MCA v2.0, API v2.0, Component v1.8.1)
>                  MCA plm: isolated (MCA v2.0, API v2.0, Component v1.8.1)
>                  MCA plm: rsh (MCA v2.0, API v2.0, Component v1.8.1)
>                  MCA plm: slurm (MCA v2.0, API v2.0, Component v1.8.1)
>                  MCA ras: loadleveler (MCA v2.0, API v2.0, Component v1.8.1)
>                  MCA ras: simulator (MCA v2.0, API v2.0, Component v1.8.1)
>                  MCA ras: slurm (MCA v2.0, API v2.0, Component v1.8.1)
>                MCA rmaps: lama (MCA v2.0, API v2.0, Component v1.8.1)
>                MCA rmaps: mindist (MCA v2.0, API v2.0, Component v1.8.1)
>                MCA rmaps: ppr (MCA v2.0, API v2.0, Component v1.8.1)
>                MCA rmaps: rank_file (MCA v2.0, API v2.0, Component v1.8.1)
>                MCA rmaps: resilient (MCA v2.0, API v2.0, Component v1.8.1)
>                MCA rmaps: round_robin (MCA v2.0, API v2.0, Component v1.8.1)
>                MCA rmaps: seq (MCA v2.0, API v2.0, Component v1.8.1)
>                MCA rmaps: staged (MCA v2.0, API v2.0, Component v1.8.1)
>                  MCA rml: oob (MCA v2.0, API v2.0, Component v1.8.1)
>               MCA routed: binomial (MCA v2.0, API v2.0, Component v1.8.1)
>               MCA routed: debruijn (MCA v2.0, API v2.0, Component v1.8.1)
>               MCA routed: direct (MCA v2.0, API v2.0, Component v1.8.1)
>               MCA routed: radix (MCA v2.0, API v2.0, Component v1.8.1)
>                MCA state: app (MCA v2.0, API v1.0, Component v1.8.1)
>                MCA state: hnp (MCA v2.0, API v1.0, Component v1.8.1)
>                MCA state: novm (MCA v2.0, API v1.0, Component v1.8.1)
>                MCA state: orted (MCA v2.0, API v1.0, Component v1.8.1)
>                MCA state: staged_hnp (MCA v2.0, API v1.0, Component v1.8.1)
>                MCA state: staged_orted (MCA v2.0, API v1.0, Component v1.8.1)
>                MCA state: tool (MCA v2.0, API v1.0, Component v1.8.1)
>            MCA allocator: basic (MCA v2.0, API v2.0, Component v1.8.1)
>            MCA allocator: bucket (MCA v2.0, API v2.0, Component v1.8.1)
>                 MCA bcol: basesmuma (MCA v2.0, API v2.0, Component v1.8.1)
>                 MCA bcol: ptpcoll (MCA v2.0, API v2.0, Component v1.8.1)
>                  MCA bml: r2 (MCA v2.0, API v2.0, Component v1.8.1)
>                  MCA btl: openib (MCA v2.0, API v2.0, Component v1.8.1)
>                  MCA btl: self (MCA v2.0, API v2.0, Component v1.8.1)
>                  MCA btl: sm (MCA v2.0, API v2.0, Component v1.8.1)
>                  MCA btl: tcp (MCA v2.0, API v2.0, Component v1.8.1)
>                  MCA btl: vader (MCA v2.0, API v2.0, Component v1.8.1)
>                 MCA coll: basic (MCA v2.0, API v2.0, Component v1.8.1)
>                 MCA coll: hierarch (MCA v2.0, API v2.0, Component v1.8.1)
>                 MCA coll: inter (MCA v2.0, API v2.0, Component v1.8.1)
>                 MCA coll: libnbc (MCA v2.0, API v2.0, Component v1.8.1)
>                 MCA coll: ml (MCA v2.0, API v2.0, Component v1.8.1)
>                 MCA coll: self (MCA v2.0, API v2.0, Component v1.8.1)
>                 MCA coll: sm (MCA v2.0, API v2.0, Component v1.8.1)
>                 MCA coll: tuned (MCA v2.0, API v2.0, Component v1.8.1)
>                  MCA dpm: orte (MCA v2.0, API v2.0, Component v1.8.1)
>                 MCA fbtl: posix (MCA v2.0, API v2.0, Component v1.8.1)
>                MCA fcoll: dynamic (MCA v2.0, API v2.0, Component v1.8.1)
>                MCA fcoll: individual (MCA v2.0, API v2.0, Component v1.8.1)
>                MCA fcoll: static (MCA v2.0, API v2.0, Component v1.8.1)
>                MCA fcoll: two_phase (MCA v2.0, API v2.0, Component v1.8.1)
>                MCA fcoll: ylib (MCA v2.0, API v2.0, Component v1.8.1)
>                   MCA fs: ufs (MCA v2.0, API v2.0, Component v1.8.1)
>                   MCA io: ompio (MCA v2.0, API v2.0, Component v1.8.1)
>                   MCA io: romio (MCA v2.0, API v2.0, Component v1.8.1)
>                MCA mpool: grdma (MCA v2.0, API v2.0, Component v1.8.1)
>                MCA mpool: sm (MCA v2.0, API v2.0, Component v1.8.1)
>                  MCA mtl: psm (MCA v2.0, API v2.0, Component v1.8.1)
>                  MCA osc: rdma (MCA v2.0, API v3.0, Component v1.8.1)
>                  MCA osc: sm (MCA v2.0, API v3.0, Component v1.8.1)
>                  MCA pml: v (MCA v2.0, API v2.0, Component v1.8.1)
>                  MCA pml: bfo (MCA v2.0, API v2.0, Component v1.8.1)
>                  MCA pml: cm (MCA v2.0, API v2.0, Component v1.8.1)
>                  MCA pml: ob1 (MCA v2.0, API v2.0, Component v1.8.1)
>               MCA pubsub: orte (MCA v2.0, API v2.0, Component v1.8.1)
>               MCA rcache: vma (MCA v2.0, API v2.0, Component v1.8.1)
>                  MCA rte: orte (MCA v2.0, API v2.0, Component v1.8.1)
>                 MCA sbgp: basesmsocket (MCA v2.0, API v2.0, Component v1.8.1)
>                 MCA sbgp: basesmuma (MCA v2.0, API v2.0, Component v1.8.1)
>                 MCA sbgp: p2p (MCA v2.0, API v2.0, Component v1.8.1)
>             MCA sharedfp: individual (MCA v2.0, API v2.0, Component v1.8.1)
>             MCA sharedfp: lockedfile (MCA v2.0, API v2.0, Component v1.8.1)
>             MCA sharedfp: sm (MCA v2.0, API v2.0, Component v1.8.1)
>                 MCA topo: basic (MCA v2.0, API v2.1, Component v1.8.1)
>            MCA vprotocol: pessimist (MCA v2.0, API v2.0, Component v1.8.1)
>
>
>
> The command 'rpm -qa | grep ibverbs' lists the following packages:
> libibverbs-devel-static-1.1.7-1.x86_64
> libibverbs-devel-1.1.7-1.x86_64
> libibverbs-1.1.7-1.x86_64
> libibverbs-debuginfo-1.1.7-1.x86_64
> libibverbs-utils-1.1.7-1.x86_64
>
> Please let me know what I am missing.
>
> Regards,
> Chaitra
>
>
> On Wed, Jul 30, 2014 at 8:13 AM, Howard Pritchard <hpprit...@gmail.com>
> wrote:
>
>> Hi Chaitra,
>>
>> Could you send the output from your configure and output from ompi_info?
>> Could you also send the output from the node where you are building ompi
>> of
>>
>> rpm -qa | grep ibverbs
>>
>> If this command indicates that a libibverbs-devel rpm was installed on the
>> system, you should check whether it was installed in the default location
>> or was for some reason relocated. If you don't see that a libibverbs-devel
>> rpm was installed, then you need a sysadmin to install it.
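>>
>> For example, something like this (a sketch, using the package name from
>> your 'rpm -qa' output) will show where the headers landed:
>>
>>   rpm -ql libibverbs-devel | grep verbs.h
>>
>> In the default location that prints /usr/include/infiniband/verbs.h.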
>>
>>
>>
>>
>> 2014-07-29 19:35 GMT-06:00 Chaitra Kumar <chaitragku...@gmail.com>:
>>
>>> Hi Team,
>>>
>>> I am trying to set up Open MPI 1.8.1 on a system with InfiniBand.
>>>
>>> I am using the default configure options. I am not using any
>>> multithreading option.
>>>
>>> After installation, no openib components are available.
>>>
>>>
>>> I even tried with the flag '--with-verbs', but it made no difference.
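>>>
>>> Roughly, the build steps were as follows (a sketch rather than an exact
>>> transcript; /home/padmanac/openmpi181 is my install prefix):
>>>
>>>   ./configure --prefix=/home/padmanac/openmpi181 --with-verbs
>>>   make -j && make install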
>>>
>>>
>>> Should I use any other flag to enable openib? Am I missing a step?
>>>
>>> Regards,
>>> Chaitra
>>>
>>
>>
>
> <config.out>
>
>
>
