Check out the FAQ:
http://www.lam-mpi.org/faq/category6.php3
On May 10, 2007, at 9:50 PM, Code Master wrote:
On 5/11/07, Tim Prins wrote:
On Thursday 10 May 2007 07:19 pm, Code Master wrote:
> I am a newbie in Open MPI. I have just compiled a program with -g -pg (an
> MPI program with a listener thread, within which all MPI calls except
> initialization and MPI_Finalize are placed) and I run it. How
On Thursday 10 May 2007 07:19 pm, Code Master wrote:
> I am a newbie in Open MPI. I have just compiled a program with -g -pg (an
> MPI program with a listener thread, within which all MPI calls except
> initialization and MPI_Finalize are placed) and I run it. However
> it crashes and I can't find any core dump
I am a newbie in Open MPI. I have just compiled a program with -g -pg (an
MPI program with a listener thread, within which all MPI calls except
initialization and MPI_Finalize are placed) and I run it. However
it crashes and I can't find any core dump, even though I set the core dump
max size to 10
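A side note on the missing core dump: the core-file size is a per-process soft limit inherited from the launching shell, and it is commonly 0 by default. A minimal sketch of raising it before launching (the program name `./my_mpi_prog` is hypothetical):

```shell
# Raise the soft core-file size limit in this shell, then verify it.
ulimit -c unlimited
ulimit -c                  # prints the current limit, e.g. "unlimited"

# The limit must also be raised on each remote node where ranks run
# (for example, in the remote shell's startup files), since mpirun
# starts the ranks in fresh shells there.
# Hypothetical program name; substitute the real binary:
# mpirun -n 2 ./my_mpi_prog
```

When a rank does dump core, the file typically lands in that rank's working directory on the node where it crashed, so check every node.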
Good to know. This suggests that VASP, when built properly against Open
MPI, should work; perhaps there's some secret sauce in the
Makefile somewhere...? Off list, someone cited the following to me:
VASP also has a forum for things like this:
http://cms.mpi.univie.ac.at/vasp-for
On Thu, 2007-05-10 at 20:07 -0400, Jeff Squyres wrote:
> Brian --
>
> Didn't you add something to fix exactly this problem recently? I
> have a dim recollection of seeing a commit go by about this...?
>
> (I advised Steve in IM to use --disable-ipv6 in the meantime)
>
Yes, disabling it worked.
Brian --
Didn't you add something to fix exactly this problem recently? I
have a dim recollection of seeing a commit go by about this...?
(I advised Steve in IM to use --disable-ipv6 in the meantime)
On May 10, 2007, at 1:25 PM, Steve Wise wrote:
I'm trying to run a job specifically over tcp and the eth1 interface.
On Thursday 10 May 2007 11:35 am, Laurent Nguyen wrote:
> Hi Tim,
>
> Ok, thank you for all these details. I also added "static int
> pls_poe_cancel_operation(void)", similarly to what you did, and I can
> continue the compilation. But I had another problem. In
> ompi/mpi/cxx/mpicxx.cc, three variables are already defined.
I'm trying to run a job specifically over tcp and the eth1 interface.
It seems to be barfing on trying to listen via ipv6. I don't want ipv6.
How can I disable it?
Here's my mpirun line:
[root@vic12-10g ~]# mpirun --n 2 --host vic12,vic20 --mca btl self,tcp -mca
btl_tcp_if_include eth1 /root/IM
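A sketch of the workaround mentioned elsewhere in this thread: rebuild Open MPI with IPv6 compiled out, then keep the TCP BTL pinned to the desired interface. The install prefix and program name here are hypothetical:

```shell
# Rebuild Open MPI without IPv6 support (hypothetical install prefix):
./configure --prefix=/opt/openmpi --disable-ipv6
make all install

# Then pin the TCP BTL to eth1, as in the mpirun line above:
# mpirun -n 2 --host vic12,vic20 --mca btl self,tcp \
#        --mca btl_tcp_if_include eth1 ./my_mpi_prog
```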
Hi Tim,
Ok, thank you for all these details. I also added "static int
pls_poe_cancel_operation(void)", similarly to what you did, and I can
continue the compilation. But I had another problem. In
ompi/mpi/cxx/mpicxx.cc, three variables are already defined. The
preprocessor sets them to the constant
Hi Laurent,
Unfortunately, as far as I know, none of the current Open MPI developers has
access to a system with POE, so the POE process launcher has fallen into
disrepair. Attached is a patch that should allow you to compile (however, you
may also need to add #include to pls_poe_module.c).
I have previously been running parallel VASP happily with an old,
prerelease version of OpenMPI:
[terry@nocona Vasp.4.6-OpenMPI]$ head /home/terry/Install_trees/OpenMPI-1.0rc6/config.log
This file contains any messages produced by compilers while
running configure, to aid debugging if configure makes a mistake.
Hello,
I tried to install OpenMPI 1.2, but I saw some problems when
compiling files with POE. When OpenMPI 1.2.1 was released, I saw in the
bug fixes that this problem was fixed. Then I tried, but it still
doesn't work. The problem comes from orte/mca/pls/poe/pls_poe_module.c.
A static f