Re: [OMPI users] more migrating to MPI_F08

2017-03-23 Thread Jeff Squyres (jsquyres)
On Mar 23, 2017, at 3:20 PM, Tom Rosmond wrote:
> I had stared at those lines many times and it didn't register that (count) was explicitly specifying that only 1-D is allowed. Pretty cryptic. I wonder how many other Fortran programmers will be bitten by this?
My understanding is that that is s…

Re: [OMPI users] more migrating to MPI_F08

2017-03-23 Thread Tom Rosmond
Thanks, Jeff. I had stared at those lines many times and it didn't register that (count) was explicitly specifying that only 1-D is allowed. Pretty cryptic. I wonder how many other Fortran programmers will be bitten by this? T. Rosmond
On 03/23/2017 10:50 AM, Jeff Squyres (jsquyres) wrote:
> Actua…

Re: [OMPI users] openmpi installation error

2017-03-23 Thread Renato Golin
On 23 March 2017 at 17:39, Vinay Mittal wrote:
> I need mpirun to run a genome assembler.
>
> Linux installation of openmpi-2.1.0 stops during make all saying:
>
> "Perl 5.006 required--this is only version 5.00503, stopped at /usr/share/perl5/vars.pm line 3."
This looks like Perl's own verific…
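Renato's diagnosis can be checked directly: that message comes from Perl's own `use VERSION` check in vars.pm, which means an ancient interpreter (5.00503 dates from 1999) is being picked up. A generic sketch of the commands to verify which perl the build sees (not commands from the thread):

```shell
# Which perl binary is first on PATH, and what version is it?
which perl
perl -e 'print "$]\n"'      # printing 5.00503 here would confirm a stale interpreter
# Reproduce the same check that vars.pm performs:
perl -e 'use 5.006; print "perl is new enough\n"'
```

If an old vendor Perl shadows a newer one, fixing PATH before rerunning make should clear the error.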

Re: [OMPI users] more migrating to MPI_F08

2017-03-23 Thread Jeff Squyres (jsquyres)
Actually, MPI-3.1 p90:37-45 explicitly says that the array_of_blocklengths and array_of_displacements arrays must both be 1-D and of length count. If my Fortran memory serves me correctly, I think you can pass in an array subsection if your blocklengths/displacements are part of a larger array (t…
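Jeff's point about array subsections can be sketched as follows. This is a hedged illustration, not code from the thread; the names and extents are made up. Since the mpi_f08 binding declares both arguments as rank-1 arrays of length count, a contiguous rank-1 section of a larger array also type-checks:

```fortran
! Sketch: mpi_f08 accepts 1-D array subsections of a larger array.
! All names and extents here are illustrative assumptions.
program subsection_sketch
  use mpi_f08
  implicit none
  integer :: lens(100)
  integer(kind=MPI_ADDRESS_KIND) :: disps(100)
  type(MPI_Datatype) :: newtype
  integer :: i

  call MPI_Init()
  lens(1:10) = 1
  disps(1:10) = [(int((i-1)*8, MPI_ADDRESS_KIND), i = 1, 10)]

  ! Only the first 10 elements describe the new type; the
  ! subsections are rank-1, so they match the interface.
  call MPI_Type_create_hindexed(10, lens(1:10), disps(1:10), &
                                MPI_DOUBLE_PRECISION, newtype)
  call MPI_Type_commit(newtype)
  call MPI_Type_free(newtype)
  call MPI_Finalize()
end program subsection_sketch
```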

Re: [OMPI users] openmpi installation error

2017-03-23 Thread Jeff Squyres (jsquyres)
That's a pretty weird error. We don't require any specific version of Perl that I'm aware of. Are you sure that it's Open MPI's installer that is kicking out the error? Can you send all the information listed here: https://www.open-mpi.org/community/help/
> On Mar 23, 2017, at 1:39 PM, …

[OMPI users] openmpi installation error

2017-03-23 Thread Vinay Mittal
I need mpirun to run a genome assembler. Linux installation of openmpi-2.1.0 stops during make all saying: "Perl 5.006 required--this is only version 5.00503, stopped at /usr/share/perl5/vars.pm line 3." Is it really that Perl-specific? I am following the standard installation path without root…

[OMPI users] more migrating to MPI_F08

2017-03-23 Thread Tom Rosmond
Hello, Attached is a simple MPI program demonstrating a problem I have encountered with 'MPI_Type_create_hindexed' when compiling with the 'mpi_f08' module. There are 2 blocks of code that differ only in how the length and displacement arrays are declared. I get indx.f90(50): error…
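Based on the reply in this thread, the mpi_f08 interface declares array_of_blocklengths and array_of_displacements as 1-D arrays of length count, so a 2-D actual argument no longer matches at compile time. A minimal sketch of the distinction (variable names and sizes are illustrative, not taken from the attached program):

```fortran
program hindexed_sketch
  use mpi_f08
  implicit none
  integer, parameter :: n = 4
  integer :: blocklens_1d(n)               ! rank-1: matches the mpi_f08 interface
  integer(kind=MPI_ADDRESS_KIND) :: displs_1d(n)
  integer :: blocklens_2d(n, 2)            ! rank-2: rejected by mpi_f08
  integer(kind=MPI_ADDRESS_KIND) :: displs_2d(n, 2)
  type(MPI_Datatype) :: newtype
  integer :: i

  call MPI_Init()
  blocklens_1d = 1
  displs_1d = [(int(i*8, MPI_ADDRESS_KIND), i = 0, n-1)]

  ! OK: both arrays are rank-1 and of length 'count'
  call MPI_Type_create_hindexed(n, blocklens_1d, displs_1d, &
                                MPI_DOUBLE_PRECISION, newtype)
  call MPI_Type_commit(newtype)

  ! This call would NOT compile with mpi_f08 (rank mismatch):
  ! call MPI_Type_create_hindexed(n, blocklens_2d, displs_2d, &
  !                               MPI_DOUBLE_PRECISION, newtype)

  call MPI_Type_free(newtype)
  call MPI_Finalize()
end program hindexed_sketch
```

With the older mpi module (implicit interfaces) both declarations would have been accepted silently, which is presumably why the change only surfaces when migrating to mpi_f08.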

Re: [OMPI users] Openmpi 1.10.4 crashes with 1024 processes

2017-03-23 Thread Götz Waschk
On Thu, Mar 23, 2017 at 2:37 PM, Götz Waschk wrote:
> I have also tried mpirun --mca coll ^tuned --mca btl tcp,openib; this finished fine, but was quite slow. I am currently testing with mpirun --mca coll ^tuned
This one also ran fine.

Re: [OMPI users] Openmpi 1.10.4 crashes with 1024 processes

2017-03-23 Thread Götz Waschk
Hi Gilles,
On Thu, Mar 23, 2017 at 10:33 AM, Gilles Gouaillardet wrote:
> mpirun --mca btl openib,self ...
It looks like this didn't finish; I had to terminate the job during the Gather with 32 processes step.
> Then can you try
> mpirun --mca coll ^tuned --mca btl tcp,self ...
As mentioned, this…

Re: [OMPI users] Help with Open MPI 2.1.0 and PGI 16.10: Configure and C++

2017-03-23 Thread Gilles Gouaillardet
Matt, a C++ compiler is required to configure Open MPI. That being said, the C++ compiler is only used if you build the C++ bindings (which were removed in MPI-3). And unless you plan to use the mpic++ wrapper (with or without the C++ bindings), a valid C++ compiler is not required at all. /* configur…
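For reference, the compilers that configure uses can be forced on its command line. This is a generic sketch with illustrative flag values; it is not the configure fragment elided above:

```shell
# Point Open MPI's configure at specific PGI compilers
# (pgc++ replaces the retired pgcpp/pgCC front end)
./configure CC=pgcc CXX=pgc++ FC=pgfortran --prefix=$HOME/opt/openmpi
make all && make install
```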

Re: [OMPI users] Help with Open MPI 2.1.0 and PGI 16.10: Configure and C++

2017-03-23 Thread Reuti
Hi,
On 22.03.2017 at 20:12, Matt Thompson wrote:
> […]
>
> Ah. PGI 16.9+ now uses pgc++ to do C++ compiling, not pgcpp. So, I hacked configure so that references to pgCC (nonexistent on macOS) are gone and all pgcpp became pgc++, but:
This is not unique to macOS. pgCC used the STLPort STL and…

Re: [OMPI users] Openmpi 1.10.4 crashes with 1024 processes

2017-03-23 Thread Götz Waschk
Hi Gilles, I'm currently testing; here are some preliminary results:
On Thu, Mar 23, 2017 at 10:33 AM, Gilles Gouaillardet wrote:
> Can you please try
> mpirun --mca btl tcp,self ...
This failed to produce the program output; there were lots of errors like this: [pax11-00][[54124,1],31][btl_…

Re: [OMPI users] a question about MPI dynamic process manage

2017-03-23 Thread Jeff Squyres (jsquyres)
It's likely a lot more efficient to MPI_COMM_SPAWN *all* of your children at once, and then subdivide the resulting newcomm communicator as desired. It is *possible* to have a series of MPI_COMM_SPAWN calls that each spawn a single child process, and then later join all of those children into a singl…
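Jeff's suggestion (one spawn, then subdivide) might look roughly like this in mpi_f08. The worker binary name, child count, and split rule are all illustrative assumptions, not details from the thread:

```fortran
program spawn_sketch
  use mpi_f08
  implicit none
  type(MPI_Comm) :: intercomm, allcomm, subcomm
  integer :: rank

  call MPI_Init()
  ! Spawn all 8 children in a single call instead of 8 separate spawns:
  call MPI_Comm_spawn('worker', MPI_ARGV_NULL, 8, MPI_INFO_NULL, 0, &
                      MPI_COMM_WORLD, intercomm, MPI_ERRCODES_IGNORE)
  ! Merge parents and children into one intracommunicator...
  call MPI_Intercomm_merge(intercomm, .false., allcomm)
  ! ...then carve it up however the application needs, e.g. into pairs:
  call MPI_Comm_rank(allcomm, rank)
  call MPI_Comm_split(allcomm, rank / 2, rank, subcomm)
  call MPI_Finalize()
end program spawn_sketch
```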

Re: [OMPI users] Erors and segmentation faults when installing openmpi-2.1

2017-03-23 Thread Jeff Squyres (jsquyres)
Note that Open MPI and MPICH are different implementations of the MPI specification. If you are mixing an Open MPI tarball install with an MPICH apt install, things will likely go downhill from there. You need to make sure you use Open MPI *or* MPICH, not both.
> On Mar 23, 2017, at 5:38 AM, Dimitrova, Mar…
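A quick way to confirm which implementation each tool belongs to; these are generic commands, and the exact output strings are an assumption on my part:

```shell
# See which binaries are first on PATH
which mpicc mpirun
# Open MPI's mpirun identifies itself explicitly, e.g. "mpirun (Open MPI) 2.1.0"
mpirun --version
# Inspect the wrapper: Open MPI uses --showme, MPICH uses -show
mpicc --showme 2>/dev/null || mpicc -show
```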

[OMPI users] Erors and segmentation faults when installing openmpi-2.1

2017-03-23 Thread Dimitrova, Maria
Hello, I am setting up a freshly installed Ubuntu 16.04 computer to do some parallel programming, and I need the MPI compilers for C and Fortran. Using the provided tar archive from the download page produces a series of errors (a very long list, because I tried running make all many times, but I…

Re: [OMPI users] Openmpi 1.10.4 crashes with 1024 processes

2017-03-23 Thread Gilles Gouaillardet
Can you please try
mpirun --mca btl tcp,self ...
And if it works:
mpirun --mca btl openib,self ...
Then can you try:
mpirun --mca coll ^tuned --mca btl tcp,self ...
That will help figure out whether the error is in the pml or the coll framework/module. Cheers, Gilles
On Thursday, March 23, 2017…

Re: [OMPI users] Openmpi 1.10.4 crashes with 1024 processes

2017-03-23 Thread Åke Sandgren
OK, we have E5-2690v4's and Connect-IB.
On 03/23/2017 10:11 AM, Götz Waschk wrote:
> On Thu, Mar 23, 2017 at 9:59 AM, Åke Sandgren wrote:
>> E5-2697A which version? v4?
> Hi, yes, that one:
> Intel(R) Xeon(R) CPU E5-2697A v4 @ 2.60GHz
>
> Regards, Götz

Re: [OMPI users] Openmpi 1.10.4 crashes with 1024 processes

2017-03-23 Thread Götz Waschk
On Thu, Mar 23, 2017 at 9:59 AM, Åke Sandgren wrote:
> E5-2697A which version? v4?
Hi, yes, that one: Intel(R) Xeon(R) CPU E5-2697A v4 @ 2.60GHz
Regards, Götz

Re: [OMPI users] Openmpi 1.10.4 crashes with 1024 processes

2017-03-23 Thread Åke Sandgren
E5-2697A which version? v4?
On 03/23/2017 09:53 AM, Götz Waschk wrote:
> Hi Åke,
>
> I have E5-2697A CPUs and Mellanox ConnectX-3 FDR Infiniband. I'm using EL7.3 as the operating system.
-- Ake Sandgren, HPC2N, Umea University, S-90187 Umea, Sweden, Internet: a...@hpc2n.umu.se, Phone: +46 90…

Re: [OMPI users] Openmpi 1.10.4 crashes with 1024 processes

2017-03-23 Thread Götz Waschk
Hi Howard, I had tried to send the config.log of my 2.1.0 build, but I guess it was too big for the list. I'm trying again with a compressed file. I have based it on the OpenHPC package. Unfortunately, it still crashes even with the vader btl disabled via this command line: mpirun --mca btl "^vader" IMB-…

Re: [OMPI users] Openmpi 1.10.4 crashes with 1024 processes

2017-03-23 Thread Götz Waschk
Hi Åke, I have E5-2697A CPUs and Mellanox ConnectX-3 FDR Infiniband. I'm using EL7.3 as the operating system. Regards, Götz Waschk
On Thu, Mar 23, 2017 at 9:28 AM, Åke Sandgren wrote:
> Since I'm seeing similar bus errors from both Open MPI and other places on our system, I'm wondering: what ha…

Re: [OMPI users] Openmpi 1.10.4 crashes with 1024 processes

2017-03-23 Thread Åke Sandgren
Since I'm seeing similar bus errors from both Open MPI and other places on our system, I'm wondering: what hardware do you have? CPUs, interconnect, etc.
On 03/23/2017 08:45 AM, Götz Waschk wrote:
> Hi Howard,
>
> I have attached my config.log file for version 2.1.0. I have based it on the OpenH…