On Mon, Oct 15, 2018 at 6:21 AM Antonio Trande <anto.tra...@gmail.com> wrote:
> Hi all.
>
> PETSc-3.10.2 compiled with OpenMPI-2.1.5 on Fedora 30 (devel branch) (or
> with OpenMPI-2.1.1 on Fedora 28), on x86 architectures only:

This looks like an OpenMPI bug. Does it work with MPICH?
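One quick way to check could be to rebuild the same tarball against MPICH and
rerun the same test target. This is only an untested sketch: --download-mpich
and the parallel-PETSC_ARCH rebuild are standard PETSc usage, but the build
directory is taken from your log and the 'i386-mpich' arch name is just
illustrative.

  # Sketch (untested): rebuild the same PETSc tree against MPICH,
  # then rerun the installation check that failed under OpenMPI.
  cd /builddir/build/BUILD/petsc-3.10.2/buildopenmpi_dir   # path from your log
  ./configure PETSC_ARCH=i386-mpich --download-mpich
  make PETSC_ARCH=i386-mpich all
  make PETSC_ARCH=i386-mpich test

If ex19 then runs cleanly, that would point at OpenMPI rather than PETSc; note
that every frame in the backtraces below is in libmpi, libopen-pal, or the
pmix component, not in PETSc itself.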
  Thanks,

     Matt

> the 'src/snes/examples/tutorials/ex19' test never terminates, with the
> following output:
>
> + export LD_LIBRARY_PATH=/builddir/build/BUILD/petsc-3.10.2/buildopenmpi_dir/i386/lib
> + LD_LIBRARY_PATH=/builddir/build/BUILD/petsc-3.10.2/buildopenmpi_dir/i386/lib
> + export PETSC_DIR=/builddir/build/BUILD/petsc-3.10.2/buildopenmpi_dir
> + PETSC_DIR=/builddir/build/BUILD/petsc-3.10.2/buildopenmpi_dir
> + export PETSC_ARCH=i386
> + PETSC_ARCH=i386
> + export MPI_INTERFACE_HOSTNAME=localhost
> + MPI_INTERFACE_HOSTNAME=localhost
> + export 'PETSCVALGRIND_OPTIONS= --tool=memcheck --leak-check=yes --track-origins=yes'
> + PETSCVALGRIND_OPTIONS=' --tool=memcheck --leak-check=yes --track-origins=yes'
> + export 'CFLAGS=-O0 -g -Wl,-z,now -fPIC'
> + CFLAGS='-O0 -g -Wl,-z,now -fPIC'
> + export 'CXXFLAGS=-O0 -g -Wl,-z,now -fPIC'
> + CXXFLAGS='-O0 -g -Wl,-z,now -fPIC'
> + export 'FFLAGS=-O0 -g -Wl,-z,now -fPIC -I/usr/lib/gfortran/modules'
> + FFLAGS='-O0 -g -Wl,-z,now -fPIC -I/usr/lib/gfortran/modules'
> + make -C buildopenmpi_dir test 'MPIEXEC=/builddir/build/BUILD/petsc-3.10.2/buildopenmpi_dir/lib/petsc/bin/petscmpiexec -valgrind'
> make: Entering directory '/builddir/build/BUILD/petsc-3.10.2/buildopenmpi_dir'
> Running test examples to verify correct installation
> Using PETSC_DIR=/builddir/build/BUILD/petsc-3.10.2/buildopenmpi_dir and PETSC_ARCH=i386
> Possible error running C/C++ src/snes/examples/tutorials/ex19 with 1 MPI process
> See http://www.mcs.anl.gov/petsc/documentation/faq.html
> ==25868== Conditional jump or move depends on uninitialised value(s)
> ==25868==    at 0x8E2CCA3: opal_value_unload (in /usr/lib/openmpi/lib/libopen-pal.so.20.10.4)
> ==25868==    by 0x6088607: ompi_proc_complete_init (in /usr/lib/openmpi/lib/libmpi.so.20.10.3)
> ==25868==    by 0x608C845: ompi_mpi_init (in /usr/lib/openmpi/lib/libmpi.so.20.10.3)
> ==25868==    by 0x60B2A97: PMPI_Init_thread (in /usr/lib/openmpi/lib/libmpi.so.20.10.3)
> ==25868==    by 0x49E0B0F: PetscInitialize (pinit.c:875)
> ==25868==    by 0x8049643: main (ex19.c:106)
> ==25868== Uninitialised value was created by a stack allocation
> ==25868==    at 0x6088593: ompi_proc_complete_init (in /usr/lib/openmpi/lib/libmpi.so.20.10.3)
> ==25868==
> lid velocity = 0.0016, prandtl # = 1., grashof # = 1.
> Number of SNES iterations = 2
> ==25868== 10 bytes in 1 blocks are definitely lost in loss record 11 of 189
> ==25868==    at 0x40356A4: malloc (vg_replace_malloc.c:299)
> ==25868==    by 0xA890F6F: ??? (in /usr/lib/openmpi/lib/openmpi/mca_pmix_pmix112.so)
> ==25868==    by 0xA88FEFD: pmix_bfrop_unpack_buffer (in /usr/lib/openmpi/lib/openmpi/mca_pmix_pmix112.so)
> ==25868==    by 0xA8901D0: ??? (in /usr/lib/openmpi/lib/openmpi/mca_pmix_pmix112.so)
> ==25868==    by 0xA89BED4: ??? (in /usr/lib/openmpi/lib/openmpi/mca_pmix_pmix112.so)
> ==25868==    by 0xA899C7C: ??? (in /usr/lib/openmpi/lib/openmpi/mca_pmix_pmix112.so)
> ==25868==    by 0x8E63AFD: opal_libevent2022_event_base_loop (in /usr/lib/openmpi/lib/libopen-pal.so.20.10.4)
> ==25868==    by 0xA897C32: ??? (in /usr/lib/openmpi/lib/openmpi/mca_pmix_pmix112.so)
> ==25868==    by 0x61635DD: start_thread (in /usr/lib/libpthread-2.28.9000.so)
> ==25868==    by 0x626A699: clone (in /usr/lib/libc-2.28.9000.so)
> ==25868==
> ==25868== 10 bytes in 1 blocks are definitely lost in loss record 12 of 189
> ==25868==    at 0x40356A4: malloc (vg_replace_malloc.c:299)
> ==25868==    by 0x6201519: strdup (in /usr/lib/libc-2.28.9000.so)
> ==25868==    by 0x8E4B0FE: mca_base_var_enum_create_flag (in /usr/lib/openmpi/lib/libopen-pal.so.20.10.4)
> ==25868==    by 0x8E5DDA5: ??? (in /usr/lib/openmpi/lib/libopen-pal.so.20.10.4)
> ==25868==    by 0x8E4CC34: mca_base_framework_register (in /usr/lib/openmpi/lib/libopen-pal.so.20.10.4)
> ==25868==    by 0x8E4CCE0: mca_base_framework_open (in /usr/lib/openmpi/lib/libopen-pal.so.20.10.4)
> ==25868==    by 0x60DD028: ??? (in /usr/lib/openmpi/lib/libmpi.so.20.10.3)
> ==25868==    by 0x8E4CD68: mca_base_framework_open (in /usr/lib/openmpi/lib/libopen-pal.so.20.10.4)
> ==25868==    by 0x608C410: ompi_mpi_init (in /usr/lib/openmpi/lib/libmpi.so.20.10.3)
> ==25868==    by 0x60B2A97: PMPI_Init_thread (in /usr/lib/openmpi/lib/libmpi.so.20.10.3)
> ==25868==    by 0x49E0B0F: PetscInitialize (pinit.c:875)
> ==25868==    by 0x8049643: main (ex19.c:106)
> ==25868==
> ==25868== 17 bytes in 1 blocks are definitely lost in loss record 79 of 189
> ==25868==    at 0x40356A4: malloc (vg_replace_malloc.c:299)
> ==25868==    by 0x6201519: strdup (in /usr/lib/libc-2.28.9000.so)
> ==25868==    by 0x8E4B0FE: mca_base_var_enum_create_flag (in /usr/lib/openmpi/lib/libopen-pal.so.20.10.4)
> ==25868==    by 0x8E5DDC0: ??? (in /usr/lib/openmpi/lib/libopen-pal.so.20.10.4)
> ==25868==    by 0x8E4CC34: mca_base_framework_register (in /usr/lib/openmpi/lib/libopen-pal.so.20.10.4)
> ==25868==    by 0x8E4CCE0: mca_base_framework_open (in /usr/lib/openmpi/lib/libopen-pal.so.20.10.4)
> ==25868==    by 0x60DD028: ??? (in /usr/lib/openmpi/lib/libmpi.so.20.10.3)
> ==25868==    by 0x8E4CD68: mca_base_framework_open (in /usr/lib/openmpi/lib/libopen-pal.so.20.10.4)
> ==25868==    by 0x608C410: ompi_mpi_init (in /usr/lib/openmpi/lib/libmpi.so.20.10.3)
> ==25868==    by 0x60B2A97: PMPI_Init_thread (in /usr/lib/openmpi/lib/libmpi.so.20.10.3)
> ==25868==    by 0x49E0B0F: PetscInitialize (pinit.c:875)
> ==25868==    by 0x8049643: main (ex19.c:106)
> ...
>
> --
> ---
> Antonio Trande
> Fedora Project
> mailto 'sagitter at fedoraproject dot org'
> GPG key: 0x5E212EE1D35568BE
> GPG key server: https://keys.fedoraproject.org/

--
What most experimenters take for granted before they begin their experiments
is infinitely more interesting than any results to which their experiments
lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/