Re: [OMPI users] Compiling OpenMPI with PGI pgc++

2014-02-01 Thread Reuti
Hi, on 01.02.2014 at 15:10 Jiri Kraus wrote: > Sorry, but I don't know the details of the issue. Although the error is reported by OpenMPI's configure as pgc++ not being link-compatible with pgcc, the error in the config.log is a compiler error. So I don't think that this is a linking issue.

Re: [OMPI users] Use of __float128 with openmpi

2014-02-01 Thread Jeff Hammond
See Section 5.9.5 of MPI-3, or the section named "User-Defined Reduction Operations" (presumably numbered differently) in older copies of the MPI standard. An older but still relevant online reference is http://www.mpi-forum.org/docs/mpi-2.2/mpi22-report/node107.htm. There is a proposal to suppor
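
A minimal sketch of the user-defined reduction Jeff points to, assuming GCC's __float128 on a homogeneous x86_64 cluster (untested; the names mpi_f128, f128_sum, and qsum are illustrative, not part of MPI): wrap the 16 raw bytes in a contiguous datatype and register a custom sum with MPI_Op_create, since predefined operations such as MPI_SUM cannot operate on a byte-wise type.

    #include <mpi.h>

    /* Element-wise sum over __float128; matches the MPI_User_function signature. */
    static void qsum(void *in, void *inout, int *len, MPI_Datatype *dtype)
    {
        __float128 *a = (__float128 *) in;
        __float128 *b = (__float128 *) inout;
        for (int i = 0; i < *len; i++)
            b[i] += a[i];
    }

    int main(int argc, char **argv)
    {
        MPI_Init(&argc, &argv);

        /* Describe one __float128 as an opaque block of bytes. */
        MPI_Datatype mpi_f128;
        MPI_Type_contiguous(sizeof(__float128), MPI_BYTE, &mpi_f128);
        MPI_Type_commit(&mpi_f128);

        /* commute = 1: addition is commutative. */
        MPI_Op f128_sum;
        MPI_Op_create(qsum, 1, &f128_sum);

        int rank;
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        __float128 x = rank + 1, total;
        MPI_Allreduce(&x, &total, 1, mpi_f128, f128_sum, MPI_COMM_WORLD);

        MPI_Op_free(&f128_sum);
        MPI_Type_free(&mpi_f128);
        MPI_Finalize();
        return 0;
    }

This works because MPI only moves the bytes; all ranks must share the same 128-bit layout, and the custom op does all of the arithmetic.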

Re: [OMPI users] Use of __float128 with openmpi

2014-02-01 Thread Tim Prince
On 02/01/2014 12:42 PM, Patrick Boehl wrote: Hi all, I have a question on datatypes in openmpi: Is there an (easy?) way to use __float128 variables with openmpi? Specifically, functions like MPI_Allreduce seem to give weird results with __float128. Essentially all I found was http://beige

[OMPI users] Use of __float128 with openmpi

2014-02-01 Thread Patrick Boehl
Hi all, I have a question on datatypes in openmpi: Is there an (easy?) way to use __float128 variables with openmpi? Specifically, functions like MPI_Allreduce seem to give weird results with __float128. Essentially all I found was http://beige.ucs.indiana.edu/I590/node100.html where they

Re: [OMPI users] Implementation of TCP v/s OpenIB (Eager and Rendezvous) protocols

2014-02-01 Thread Siddhartha Jana
Thanks for the reply, Jeff. This points me in the right direction. On 01-Feb-2014 7:51 am, "Jeff Squyres (jsquyres)" wrote: > On Jan 31, 2014, at 2:49 AM, Siddhartha Jana wrote: >> Sorry for the typo: >> ** I was hoping to understand the impact of OpenMPI's implementation of these protocols using traditional TCP.

Re: [OMPI users] openmpi 1.7.4rc1 and f08 interface

2014-02-01 Thread Jeff Squyres (jsquyres)
Thanks! I noted your comment on the ticket so that it doesn't get lost. I haven't had a chance to look into this yet because we've been focusing on getting 1.7.4 out the door, and this has been identified as a 1.7.5 fix. On Jan 31, 2014, at 3:31 PM, Åke Sandgren wrote: > On 01/28/2014 08:26

Re: [OMPI users] Compiling OpenMPI with PGI pgc++

2014-02-01 Thread Jiri Kraus
Hi Reuti, sorry, but I don't know the details of the issue. Although the error is reported by OpenMPI's configure as pgc++ not being link-compatible with pgcc, the error in the config.log is a compiler error. So I don't think that this is a linking issue. > If I understand it correctly, it should be a fe
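
For the archives, a typical configure invocation with the PGI tool chain looks something like this (a sketch; the install prefix is illustrative and accepted flags vary by PGI release):

    ./configure CC=pgcc CXX=pgc++ FC=pgfortran --prefix=/opt/openmpi-pgi
    make all install

If configure still rejects pgc++, the failing test program near the reported line in config.log usually shows the actual compiler error.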

Re: [OMPI users] Running on two nodes slower than running on one node

2014-02-01 Thread Victor
Thank you all for your help. --bind-to-core increased the cluster performance by approximately 10%, so in addition to the improvements from implementing Open-MX, performance now scales within expectations - not linearly, but much better than with the original setup. On 30 January 20
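
For anyone finding this thread later: --bind-to-core is the 1.6-series spelling; from the 1.7 series onward the equivalent option is written as two words. Both are shown below with an illustrative host file and process count:

    mpirun --bind-to-core -np 16 --hostfile hosts ./app    # Open MPI 1.6.x
    mpirun --bind-to core  -np 16 --hostfile hosts ./app    # Open MPI 1.7.x and later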

Re: [OMPI users] Implementation of TCP v/s OpenIB (Eager and Rendezvous) protocols

2014-02-01 Thread Jeff Squyres (jsquyres)
On Jan 31, 2014, at 2:49 AM, Siddhartha Jana wrote: > Sorry for the typo: > ** I was hoping to understand the impact of OpenMPI's implementation of > these protocols using traditional TCP. > > This is the paper I was referring to: > Woodall, et al., "High Performance RDMA Protocols in HPC". >
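
As background for the eager/rendezvous crossover on plain TCP: in Open MPI it is governed by MCA parameters of the TCP BTL, which can be listed and overridden at run time (the value below is illustrative):

    ompi_info --param btl tcp | grep eager
    mpirun --mca btl_tcp_eager_limit 65536 -np 4 ./app

Messages at or below the eager limit are sent immediately; larger messages fall back to a rendezvous exchange.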

Re: [OMPI users] MPI hangs when application compiled with -O3, runs fine with -O0

2014-02-01 Thread Jeff Squyres (jsquyres)
Sorry for the massive delay in replying; I'm going through my inbox this morning and finding old mails that I initially missed. :-\ More below. On Jan 17, 2014, at 8:45 AM, Julien Bodart wrote: > version: 1.6.5 (compiled with Intel compilers) > > command used: > mpirun --machinefile mfile -