If you build your application with the Intel compilers and -i8, then Open MPI
must also be built with the Intel compilers and -i8.
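The advice above can be sketched as a build recipe. This is an illustrative sketch only: the install prefix, version, and source layout are hypothetical, and the exact flags should match whatever the application's makefile uses.

```shell
# Build Open MPI with the same Intel compilers and the same 8-byte-integer
# flag as the application (paths are placeholders; adjust to your system).
./configure CC=icc CXX=icpc FC=ifort \
    FCFLAGS="-i8" FFLAGS="-i8" \
    --prefix=$HOME/openmpi-intel-i8
make -j4 && make install

# Then put this MPI first in PATH and rebuild the application against it,
# so the Fortran integer sizes agree on both sides of every MPI call.
export PATH=$HOME/openmpi-intel-i8/bin:$PATH
```

Mixing an -i8 application with a default-integer MPI library silently passes 8-byte integers where the library expects 4 bytes, which commonly shows up as exactly the kind of segfault described later in this thread.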
Cheers,
Gilles
On Sunday, April 24, 2016, Elio Physics wrote:
> Well, I changed the compiler from mpif90 to mpiifort with corresponding
> flags -i8 -g and recompiled. I am
Elio
You should ask this question in the forum of the simulation program you are
using. These failures most likely have nothing to do with MPI (or, at
least, Open MPI), so this is the wrong place for these questions.
Here is a suggestion: does your program run without MPI at all?
(i.e. in a
Well, I changed the compiler from mpif90 to mpiifort with corresponding flags
-i8 -g and recompiled. I am not getting the segmentation fault problem anymore
and the program runs but later stops with no errors (except that the Fermi
energy was not found!) and with some strange empty files that ar
Hi Gilles,
I don't know what happened, but the files are not available now
and they were definitely available when I answered the email from
Ralph. The files also have a different timestamp now. This is an
extract from my email to Ralph for Solaris Sparc.
-rwxr-xr-x 1 root root 977 Apr 19 19
I don’t see any way this could be compilation related - I suspect there is
simply some error in the program (e.g., forgetting to initialize some memory
region).
> On Apr 23, 2016, at 8:03 AM, Elio Physics wrote:
>
> Hello Andy,
>
> the program is not mine. I have got it from a group upon req
Hello Andy,
the program is not mine. I got it from a group upon request. It might be
program related, because I run other codes such as Quantum ESPRESSO and they
work perfectly fine, although it was the cluster people who compiled those. Since I have
compiled the program I am having problems with, I
The challenge for the MPI experts here (of which I am NOT one!) is
that the problem appears to be in your program; MPI is simply
reporting that your program failed. If you got the program from
someone else, you will need to solicit their help. If you wrote it,
well, it is
I am not really an expert with gdb. What is the core file, and how do I use gdb?
I have got three files as an output when the executable is used. One is the
actual output which stops and the other two are error files (from which I knew
about the segmentation fault).
thanks
valgrind isn’t going to help here - there are multiple reasons why your
application could be segfaulting. Take a look at the core file with gdb and
find out where it is failing.
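The gdb suggestion can be sketched as follows. This assumes a Linux cluster where core dumps are enabled and the core file lands in the working directory; the binary path comes from the command shown later in the thread, and the core file name is an assumption (it often carries a PID suffix, e.g. core.8135).

```shell
# Allow core dumps before re-running the failing job.
ulimit -c unlimited

# Open the crashed program together with its core file.
gdb ~/Elie/SPRKKR/bin/kkrscf6.3MPI core
# Inside gdb:
#   (gdb) bt         # backtrace: the routine where the segfault happened
#   (gdb) frame 0    # select the innermost frame
#   (gdb) list       # show nearby source lines (available because of -g)
```

The -g flag already being used in the recompile is what makes the backtrace show source file names and line numbers rather than bare addresses.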
> On Apr 22, 2016, at 10:20 PM, Elio Physics wrote:
>
> One more thing I forgot to mention in my previous e-mail. In
One more thing I forgot to mention in my previous e-mail. In the output file I
get the following message:
2 total processes killed (some possibly by mpirun during cleanup)
Thanks
From: users on behalf of Elio Physics
Sent: Saturday, April 23, 2016 3:07 AM
I have used valgrind and this is what I got:
valgrind mpirun ~/Elie/SPRKKR/bin/kkrscf6.3MPI Fe_SCF.inp >
scf-51551.jlborges.fisica.ufmg.br.out
==8135== Memcheck, a memory error detector
==8135== Copyright (C) 2002-2012, and GNU GPL'd, by Julian Seward et al.
==8135== Using Valgrind-3.8.1 and Lib
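A note on the invocation quoted above: placing valgrind in front of mpirun instruments mpirun itself rather than the application ranks. A sketch of the more usual pattern, with the file names taken from this thread and the rank count (-np 2) assumed for illustration:

```shell
# Run each MPI rank under Memcheck; %p expands to the rank's PID so
# every process gets its own log file.
mpirun -np 2 valgrind --log-file=vg.%p.log \
    ~/Elie/SPRKKR/bin/kkrscf6.3MPI Fe_SCF.inp > scf.out
```

With this arrangement the Memcheck reports (including any invalid reads near the segfault) end up in the vg.*.log files instead of being interleaved with the program output.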