Hello Jeff and Gilles,
I just logged in to see the archives, and this message from Gilles -
https://www.mail-archive.com/users@lists.open-mpi.org//msg31219.html - and
this message from Jeff -
https://www.mail-archive.com/users@lists.open-mpi.org//msg31217.html - are
very useful.
The "error *** glibc detected *** $(PROGRAM): double free or corruption" is
ubiquitous and rarely has anything to do with MPI.
As Gilles said, use a debugger to figure out why your application is
corrupting the heap.
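[For illustration only - this snippet is not from the thread. A minimal C
program that triggers exactly this glibc abort, plus one common way to hunt
such bugs in an MPI run with valgrind; the process count and file names are
placeholders:]

    /* double_free.c - deliberately frees the same heap block twice,
     * which makes glibc abort with "double free or corruption". */
    #include <stdlib.h>

    int main(void) {
        double *buf = malloc(100 * sizeof(double));
        free(buf);
        free(buf);  /* second free of the same pointer: glibc aborts here */
        return 0;
    }

    /* A common way to locate the offending free in an MPI run
     * (assumes valgrind is installed):
     *   mpirun -np 2 valgrind --leak-check=full --track-origins=yes ./a.out
     */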
Jeff
On Wed, Jun 14, 2017 at 3:31 AM, ashwin .D wrote:
> Hello,
>
Hi,
First, I suggest you decide which Open MPI version you want to use; the
most up-to-date versions are 2.0.3 and 2.1.1. Then please provide all the
info Jeff previously requested.
Ideally, you would write a simple and standalone program that exhibits
the issue, so we can reproduce and investigate it.
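[As a sketch of what such a standalone reproducer could look like - the
names and the body are placeholders, to be replaced with the smallest piece
of your application that still crashes:]

    /* repro.c - minimal self-contained MPI program; build with
     * "mpicc repro.c" and run with "mpirun -np 2 ./a.out". */
    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv) {
        int rank, size;
        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_size(MPI_COMM_WORLD, &size);
        /* Insert the minimal code path that reproduces the crash here. */
        printf("rank %d of %d reached the end\n", rank, size);
        MPI_Finalize();
        return 0;
    }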
Hello,
I found a thread about Intel MPI (although I am using gfortran
4.8.5 and Open MPI 2.1.1) -
https://software.intel.com/en-us/forums/intel-fortran-compiler-for-linux-and-mac-os-x/topic/564266
- but the error the OP gets is the same as mine:
*** glibc detected *** ./a.out: double free or corruption
On Jun 13, 2017, at 8:22 AM, ashwin .D wrote:
>
> Also, when I try to build and run "make check" I get these errors - am I
> clear to proceed, or is my installation broken? This is on Ubuntu 16.04 LTS.
>
> ============================================
> Open MPI 2.1.1: test/datatype/test-suite.log
Hi
Can you please post your configure command line for 2.1.1?
On which architecture are you running? x86_64?
Cheers,
Gilles
"ashwin .D" wrote:
>Also, when I try to build and run "make check" I get these errors - am I
>clear to proceed, or is my installation broken? This is on Ubuntu 16.04 LTS.
If you are not using external32 in datatypes code, this issue doesn't
matter. I don't think most implementations support external32...
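[For context: external32 is the portable, big-endian data representation
used by the MPI_Pack_external family of calls, which is usually the only
place it appears in user code. A minimal sketch of the call pattern, with
arbitrary buffer contents:]

    /* pack_ext.c - packs four ints into the canonical "external32"
     * representation. */
    #include <mpi.h>
    #include <stdio.h>
    #include <stdlib.h>

    int main(int argc, char **argv) {
        MPI_Init(&argc, &argv);
        int data[4] = {1, 2, 3, 4};
        MPI_Aint size = 0, position = 0;
        /* How many bytes does the external32 form of 4 ints need? */
        MPI_Pack_external_size("external32", 4, MPI_INT, &size);
        char *buf = malloc(size);
        MPI_Pack_external("external32", data, 4, MPI_INT, buf, size, &position);
        printf("packed %ld bytes\n", (long)position);
        free(buf);
        MPI_Finalize();
        return 0;
    }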
Double free indicates an application error. Such errors are possible but
extremely rare inside of MPI libraries; the incidence of applications
corrupting memory is, by comparison, quite high.
Also, when I try to build and run "make check" I get these errors - am I
clear to proceed, or is my installation broken? This is on Ubuntu 16.04
LTS.
============================================
Open MPI 2.1.1: test/datatype/test-suite.log
============================================
Hello,
I am using Open MPI 2.0.0 with computational fluid dynamics
software, and I am encountering a series of errors when running it with
mpirun. This is my lscpu output:
CPU(s):                4
On-line CPU(s) list:   0-3
Thread(s) per core:    2
Core(s) per socket:    2
Socket(s):             1