The mpi_f08 bindings in v3.1.0 are incorrect: they are missing the
ASYNCHRONOUS attribute. That will be fixed in v3.1.1.
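
For reference, master's accumulate_f08.F90 declares the origin buffer the way
MPI-3.1 specifies it, with INTENT(IN) and ASYNCHRONOUS in a single attribute
list (see the diff quoted below), while v3.1.0 dropped the attribute. Note
also that the other bindings in the grep output below spell the attribute
through the OMPI_ASYNCHRONOUS macro, which presumably expands to nothing when
configure finds no compiler support (your ompi_info reports "Fort
ASYNCHRONOUS: no"), whereas master's accumulate_f08.F90 hard-codes it. Here
is a compilable sketch of the declaration's shape, simplified for
illustration: a plain real array stands in for OMPI_FORTRAN_IGNORE_TKR_TYPE,
and most of MPI_Accumulate's arguments are dropped.

! Simplified sketch, not Open MPI source: only the attribute list on
! origin_addr matches the line that master adds and that v3.1.0 is missing.
module accumulate_shape
   implicit none
   interface
      subroutine sketch_accumulate(origin_addr, origin_count)
         implicit none
         real, intent(in), asynchronous :: origin_addr(*)  ! the disputed list
         integer, intent(in) :: origin_count
      end subroutine sketch_accumulate
   end interface
end module accumulate_shape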



> On Jun 6, 2018, at 12:06 PM, Siegmar Gross 
> <siegmar.gr...@informatik.hs-fulda.de> wrote:
> 
> Hi Jeff,
> 
>> I asked some Fortran gurus, and they don't think that there
>> is any restriction on having ASYNCHRONOUS and INTENT on the
>> same line.  Indeed, Open MPI's definition of MPI_ACCUMULATE
>> seems to agree with what is in MPI-3.1.
>> Is this a new version of a Fortran compiler that you're
>> using, perchance?  I.e., is this a compiler bug?
> 
> No, I have been using this compiler for nearly a year now, and I don't
> know if it is a compiler bug.
> 
> loki opt 107 cc -V
> cc: Studio 12.6 Sun C 5.15 Linux_i386 2017/05/30
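
One way to settle the compiler-bug question independent of Open MPI is to
compile a minimal test that combines the two attributes; the standard, per
the Fortran gurus cited above, places no restriction on combining them, so a
conforming compiler should accept this sketch (file and procedure names are
invented for illustration, not taken from Open MPI):

! async_intent_test.f90 -- minimal, self-contained check, invented for
! illustration; not taken from Open MPI.
subroutine sink(buf)
   implicit none
   real, intent(in), asynchronous :: buf(*)   ! the combination in question
end subroutine sink

program async_intent_test
   implicit none
   real :: x(4)
   x = 1.0
   call sink(x)
   print *, 'ASYNCHRONOUS and INTENT accepted in one attribute list'
end program async_intent_test

If "f95 -c async_intent_test.f90" reports the same "Attributes ASYNCHRONOUS
and INTENT must not appear in the same attribute list" error, the restriction
is in the compiler rather than in Open MPI's sources.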
> 
> 
> I was able to build Open MPI 3.1.0 with the compiler.
> 
> loki fd1026 101 ompi_info|more
>                 Package: Open MPI root@loki Distribution
>                Open MPI: 3.1.0
>  Open MPI repo revision: v3.1.0
>   Open MPI release date: May 07, 2018
>                Open RTE: 3.1.0
>  Open RTE repo revision: v3.1.0
>   Open RTE release date: May 07, 2018
>                    OPAL: 3.1.0
>      OPAL repo revision: v3.1.0
>       OPAL release date: May 07, 2018
>                 MPI API: 3.1.0
>            Ident string: 3.1.0
>                  Prefix: /usr/local/openmpi-3.1.0_64_cc
> Configured architecture: x86_64-unknown-linux-gnu
>          Configure host: loki
>           Configured by: root
>           Configured on: Tue May  8 09:10:43 CEST 2018
>          Configure host: loki
>  Configure command line: '--prefix=/usr/local/openmpi-3.1.0_64_cc'
>    '--libdir=/usr/local/openmpi-3.1.0_64_cc/lib64'
>    '--with-jdk-bindir=/usr/local/jdk-10/bin'
>    '--with-jdk-headers=/usr/local/jdk-10/include' 'JAVA_HOME=/usr/local/jdk-10'
>    'LDFLAGS=-m64 -mt -Wl,-z -Wl,noexecstack -L/usr/local/lib64' 'CC=cc'
>    'CXX=CC' 'FC=f95' 'CFLAGS=-m64 -mt -I/usr/local/include'
>    'CXXFLAGS=-m64 -I/usr/local/include' 'FCFLAGS=-m64'
>    'CPP=cpp -I/usr/local/include' 'CXXCPP=cpp -I/usr/local/include'
>    '--enable-mpi-cxx' '--enable-cxx-exceptions' '--enable-mpi-java'
>    '--with-valgrind=/usr/local/valgrind' '--with-hwloc=internal'
>    '--without-verbs' '--with-wrapper-cflags=-m64 -mt'
>    '--with-wrapper-cxxflags=-m64' '--with-wrapper-fcflags=-m64'
>    '--with-wrapper-ldflags=-mt' '--enable-debug'
>                Built by: root
>                Built on: Tue May  8 09:20:42 CEST 2018
>              Built host: loki
>              C bindings: yes
>            C++ bindings: yes
>             Fort mpif.h: yes (all)
>            Fort use mpi: yes (full: ignore TKR)
>       Fort use mpi size: deprecated-ompi-info-value
>        Fort use mpi_f08: yes
> Fort mpi_f08 compliance: The mpi_f08 module is available, but due to
>    limitations in the f95 compiler and/or Open MPI, does not support the
>    following: array subsections, Fortran '08-specified ASYNCHRONOUS
>    behavior, direct passthru (where possible) to underlying Open MPI's C
>    functionality
>  Fort mpi_f08 subarrays: no
>           Java bindings: yes
>  Wrapper compiler rpath: runpath
>              C compiler: cc
>     C compiler absolute: /opt/solstudio12.6/bin/cc
>  C compiler family name: SUN
>      C compiler version: 0x5150
>            C++ compiler: CC
>   C++ compiler absolute: /opt/solstudio12.6/bin/CC
>           Fort compiler: f95
>       Fort compiler abs: /opt/solstudio12.6/bin/f95
>         Fort ignore TKR: yes (!$PRAGMA IGNORE_TKR)
>   Fort 08 assumed shape: no
>      Fort optional args: yes
>          Fort INTERFACE: yes
>    Fort ISO_FORTRAN_ENV: yes
>       Fort STORAGE_SIZE: yes
>      Fort BIND(C) (all): yes
>      Fort ISO_C_BINDING: yes
> Fort SUBROUTINE BIND(C): yes
>       Fort TYPE,BIND(C): yes
> Fort T,BIND(C,name="a"): yes
>            Fort PRIVATE: yes
>          Fort PROTECTED: yes
>           Fort ABSTRACT: yes
>       Fort ASYNCHRONOUS: no
>          Fort PROCEDURE: yes
>         Fort USE...ONLY: yes
>           Fort C_FUNLOC: yes
> Fort f08 using wrappers: yes
>         Fort MPI_SIZEOF: yes
>             C profiling: yes
>           C++ profiling: yes
>   Fort mpif.h profiling: yes
>  Fort use mpi profiling: yes
>   Fort use mpi_f08 prof: yes
>          C++ exceptions: yes
>          Thread support: posix (MPI_THREAD_MULTIPLE: yes, OPAL support: yes,
>    OMPI progress: no, ORTE progress: yes, Event lib: yes)
>           Sparse Groups: no
>  Internal debug support: yes
>  MPI interface warnings: yes
>     MPI parameter check: runtime
> Memory profiling support: no
> Memory debugging support: no
>              dl support: yes
>   Heterogeneous support: no
> mpirun default --prefix: no
>       MPI_WTIME support: native
>     Symbol vis. support: yes
>   Host topology support: yes
>          MPI extensions: affinity, cuda
>   FT Checkpoint support: no (checkpoint thread: no)
>   C/R Enabled Debugging: no
>  MPI_MAX_PROCESSOR_NAME: 256
>    MPI_MAX_ERROR_STRING: 256
>     MPI_MAX_OBJECT_NAME: 64
>        MPI_MAX_INFO_KEY: 36
>        MPI_MAX_INFO_VAL: 256
>       MPI_MAX_PORT_NAME: 1024
>  MPI_MAX_DATAREP_STRING: 128
>           MCA allocator: bucket (MCA v2.1.0, API v2.0.0, Component v3.1.0)
> ...
> 
> 
> Some source code files in both versions contain INTENT and ASYNCHRONOUS,
> so I don't know why I can compile one version but not the other.
> Do you have any ideas?
> 
> loki src 132 grep ASYNCHRONOUS openmpi-master/openmpi-master-201806060243-64a5baa/*/*/*/*/*/* | & grep -v directory
> openmpi-master/openmpi-master-201806060243-64a5baa/ompi/mpi/fortran/mpif-h/profile/Makefile.in:OMPI_FORTRAN_HAVE_ASYNCHRONOUS = @OMPI_FORTRAN_HAVE_ASYNCHRONOUS@
> openmpi-master/openmpi-master-201806060243-64a5baa/ompi/mpi/fortran/use-mpi-f08/mod/Makefile.in:OMPI_FORTRAN_HAVE_ASYNCHRONOUS = @OMPI_FORTRAN_HAVE_ASYNCHRONOUS@
> openmpi-master/openmpi-master-201806060243-64a5baa/ompi/mpi/fortran/use-mpi-f08/mod/mpi-f08-interfaces.F90:! ASYNCHRONOUS had to removed from the base argument because
> openmpi-master/openmpi-master-201806060243-64a5baa/ompi/mpi/fortran/use-mpi-f08/mod/pmpi-f08-interfaces.F90:! ASYNCHRONOUS had to removed from the base argument because
> openmpi-master/openmpi-master-201806060243-64a5baa/ompi/mpi/fortran/use-mpi-f08/profile/pimrecv_f08.F90:   OMPI_FORTRAN_IGNORE_TKR_TYPE OMPI_ASYNCHRONOUS :: buf
> openmpi-master/openmpi-master-201806060243-64a5baa/ompi/mpi/fortran/use-mpi-f08/profile/pirecv_f08.F90:   OMPI_FORTRAN_IGNORE_TKR_TYPE OMPI_ASYNCHRONOUS :: buf
> openmpi-master/openmpi-master-201806060243-64a5baa/ompi/mpi/fortran/use-mpi-f08/profile/pirsend_f08.F90:   OMPI_FORTRAN_IGNORE_TKR_TYPE, INTENT(IN) OMPI_ASYNCHRONOUS :: buf
> openmpi-master/openmpi-master-201806060243-64a5baa/ompi/mpi/fortran/use-mpi-f08/profile/pisend_f08.F90:   OMPI_FORTRAN_IGNORE_TKR_TYPE, INTENT(IN) OMPI_ASYNCHRONOUS :: buf
> openmpi-master/openmpi-master-201806060243-64a5baa/opal/mca/hwloc/hwloc201/hwloc/Makefile.in:OMPI_FORTRAN_HAVE_ASYNCHRONOUS = @OMPI_FORTRAN_HAVE_ASYNCHRONOUS@
> 
> 
> loki src 133 grep ASYNCHRONOUS openmpi-3.1.0/openmpi-3.1.0/*/*/*/*/*/* | & grep -v directory
> openmpi-3.1.0/openmpi-3.1.0/ompi/mpi/fortran/mpif-h/profile/Makefile.in:OMPI_FORTRAN_HAVE_ASYNCHRONOUS = @OMPI_FORTRAN_HAVE_ASYNCHRONOUS@
> openmpi-3.1.0/openmpi-3.1.0/ompi/mpi/fortran/use-mpi-f08/mod/Makefile.in:OMPI_FORTRAN_HAVE_ASYNCHRONOUS = @OMPI_FORTRAN_HAVE_ASYNCHRONOUS@
> openmpi-3.1.0/openmpi-3.1.0/ompi/mpi/fortran/use-mpi-f08/mod/mpi-f08-interfaces.F90:! ASYNCHRONOUS had to removed from the base argument because
> openmpi-3.1.0/openmpi-3.1.0/ompi/mpi/fortran/use-mpi-f08/mod/pmpi-f08-interfaces.F90:! ASYNCHRONOUS had to removed from the base argument because
> openmpi-3.1.0/openmpi-3.1.0/ompi/mpi/fortran/use-mpi-f08/profile/pimrecv_f08.F90:   OMPI_FORTRAN_IGNORE_TKR_TYPE OMPI_ASYNCHRONOUS :: buf
> openmpi-3.1.0/openmpi-3.1.0/ompi/mpi/fortran/use-mpi-f08/profile/pirecv_f08.F90:   OMPI_FORTRAN_IGNORE_TKR_TYPE OMPI_ASYNCHRONOUS :: buf
> openmpi-3.1.0/openmpi-3.1.0/ompi/mpi/fortran/use-mpi-f08/profile/pirsend_f08.F90:   OMPI_FORTRAN_IGNORE_TKR_TYPE, INTENT(IN) OMPI_ASYNCHRONOUS :: buf
> openmpi-3.1.0/openmpi-3.1.0/ompi/mpi/fortran/use-mpi-f08/profile/pisend_f08.F90:   OMPI_FORTRAN_IGNORE_TKR_TYPE, INTENT(IN) OMPI_ASYNCHRONOUS :: buf
> openmpi-3.1.0/openmpi-3.1.0/opal/mca/hwloc/hwloc1117/hwloc/Makefile.in:OMPI_FORTRAN_HAVE_ASYNCHRONOUS = @OMPI_FORTRAN_HAVE_ASYNCHRONOUS@
> loki src 134
> 
> 
> The problematic file differs between the two versions.
> 
> loki src 139 diff openmpi-master/openmpi-master-201806060243-64a5baa/ompi/mpi/fortran/use-mpi-f08/accumulate_f08.F90 openmpi-3.1.0/openmpi-3.1.0/ompi/mpi/fortran/use-mpi-f08/accumulate_f08.F90
> 16c16
> <    OMPI_FORTRAN_IGNORE_TKR_TYPE, INTENT(IN), ASYNCHRONOUS :: origin_addr
> ---
> >    OMPI_FORTRAN_IGNORE_TKR_TYPE, INTENT(IN) :: origin_addr
> loki src 140
> 
> 
> Hopefully someone knows a solution. Otherwise I may have to wait for
> the next version of the Oracle compiler.
> 
> Best regards
> 
> Siegmar
> 
>>> On Jun 6, 2018, at 7:11 AM, Siegmar Gross 
>>> <siegmar.gr...@informatik.hs-fulda.de> wrote:
>>> 
>>> Hi,
>>> 
>>> I've tried to install openmpi-master-201806060243-64a5baa on my "SUSE Linux
>>> Enterprise Server 12.3 (x86_64)" with Sun C 5.15 (Oracle Developer Studio
>>> 12.6). Unfortunately I still get the following error that I already reported
>>> on April 12th and May 5th.
>>> 
>>> loki openmpi-master-201806060243-64a5baa-Linux.x86_64.64_cc 123 head -7 config.log | tail -1
>>>  $ ../openmpi-master-201806060243-64a5baa/configure
>>>    --prefix=/usr/local/openmpi-master_64_cc
>>>    --libdir=/usr/local/openmpi-master_64_cc/lib64
>>>    --with-jdk-bindir=/usr/local/jdk-10/bin
>>>    --with-jdk-headers=/usr/local/jdk-10/include JAVA_HOME=/usr/local/jdk-10
>>>    LDFLAGS=-m64 -mt -Wl,-z -Wl,noexecstack -L/usr/local/lib64 CC=cc CXX=CC
>>>    FC=f95 CFLAGS=-m64 -mt -I/usr/local/include
>>>    CXXFLAGS=-m64 -I/usr/local/include FCFLAGS=-m64
>>>    CPP=cpp -I/usr/local/include CXXCPP=cpp -I/usr/local/include
>>>    --enable-mpi-cxx --enable-cxx-exceptions --enable-mpi-java
>>>    --with-valgrind=/usr/local/valgrind --with-hwloc=internal --without-verbs
>>>    --with-wrapper-cflags=-m64 -mt --with-wrapper-cxxflags=-m64
>>>    --with-wrapper-fcflags=-m64 --with-wrapper-ldflags=-mt --enable-debug
>>> loki openmpi-master-201806060243-64a5baa-Linux.x86_64.64_cc 124
>>> 
>>> 
>>> loki openmpi-master-201806060243-64a5baa-Linux.x86_64.64_cc 124 tail -20 log.make.Linux.x86_64.64_cc
>>>  PPFC     add_error_class_f08.lo
>>>  PPFC     add_error_code_f08.lo
>>>  PPFC     add_error_string_f08.lo
>>>  PPFC     aint_add_f08.lo
>>> 
>>>   OMPI_FORTRAN_IGNORE_TKR_TYPE, INTENT(IN), ASYNCHRONOUS :: origin_addr
>>>                                             ^
>>> "../../../../../openmpi-master-201806060243-64a5baa/ompi/mpi/fortran/use-mpi-f08/accumulate_f08.F90",
>>>  Line = 16, Column = 46: ERROR: Attributes ASYNCHRONOUS and INTENT must not 
>>> appear in the same attribute list.
>>> 
>>> f90comp: 194 SOURCE LINES
>>> f90comp: 1 ERRORS, 0 WARNINGS, 0 OTHER MESSAGES, 0 ANSI
>>> Makefile:4417: recipe for target 'accumulate_f08.lo' failed
>>> make[2]: *** [accumulate_f08.lo] Error 1
>>> make[2]: *** Waiting for unfinished jobs....
>>> make[2]: Leaving directory '/export2/src/openmpi-master/openmpi-master-201806060243-64a5baa-Linux.x86_64.64_cc/ompi/mpi/fortran/use-mpi-f08'
>>> Makefile:3493: recipe for target 'all-recursive' failed
>>> make[1]: *** [all-recursive] Error 1
>>> make[1]: Leaving directory '/export2/src/openmpi-master/openmpi-master-201806060243-64a5baa-Linux.x86_64.64_cc/ompi'
>>> Makefile:1894: recipe for target 'all-recursive' failed
>>> make: *** [all-recursive] Error 1
>>> loki openmpi-master-201806060243-64a5baa-Linux.x86_64.64_cc 125
>>> 
>>> 
>>> I would be grateful if somebody could fix the problem, or is it a problem
>>> of the Oracle compiler? Do you need anything else? Thank you very much in
>>> advance for any help.
>>> 
>>> 
>>> Kind regards
>>> 
>>> Siegmar
_______________________________________________
users mailing list
users@lists.open-mpi.org
https://lists.open-mpi.org/mailman/listinfo/users
