Thanks Bert for the reply, but having these macros in ompi/version.h only if a
special option is given to configure is useless for what I would like to enable
in OpenMPI with the present suggestion.
This is because the whole idea is to make it possible to write portable,
MPI-compliant C/C++ progr
Hello,
I can see that Patrick posted the same error as mine on:
http://www.open-mpi.org/community/lists/devel/2006/12/1230.php
Can I please find out the status? I have a simple test case which
demonstrates that --prefix is not working with mpirun. I built it into
/usr/local/openmpi-1.1.3
and it works. Ho
Also, I see mention in your FAQ about config.log. My openmpi does not
appear to be generating it, at least not anywhere in the install tree.
How can I enable the creation of the log file?
Thanks,
Dennis
-----Original Message-----
From: Dennis McRitchie
Sent: Friday, February 02, 2007 6:08
When I submit a simple job (described below) using PBS, I always get one
of the following two errors:
1) [adroit-28:03945] [0,0,1]-[0,0,0] mca_oob_tcp_peer_recv_blocking:
recv() failed with errno=104
2) [adroit-30:03770] [0,0,3]-[0,0,0] mca_oob_tcp_peer_complete_connect:
connection failed (errno=1
Hi Todd
To help us provide advice, could you tell us what version of OpenMPI you are
using?
In the meantime, try adding "-mca pls_rsh_num_concurrent 200" to your mpirun
command line. You can up the number of concurrent daemons we launch to
anything your system will support; basically, we limit the numb
I've been checking the OpenMPI code, trying to find something, but
still no luck. I'll continue checking the code.
On 2/2/07, Robert Latham wrote:
On Tue, Jan 30, 2007 at 04:55:09PM -0500, Ivan de Jesus Deras Tabora wrote:
> Then I found all the references to MPI_Type_create_subarray and
>
On Tue, Jan 30, 2007 at 04:55:09PM -0500, Ivan de Jesus Deras Tabora wrote:
> Then I found all the references to MPI_Type_create_subarray and
> created a little program just to test that part of the code; the code I
> created is:
...
> After running this little program using mpirun, it raises the
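For reference, a minimal test of MPI_Type_create_subarray of the kind described
above might look like the sketch below. This is an illustrative reconstruction,
not the poster's original program; the array dimensions, the MPI_INT element
type, and the C ordering are assumptions made for the example.

#include <stdio.h>
#include <mpi.h>

int main(int argc, char **argv)
{
    int rank;
    MPI_Datatype subarray;

    /* Describe a 2x2 block starting at offset (1,1) inside a 4x4 array
     * of ints; the sizes are arbitrary choices for this test. */
    int sizes[2]    = {4, 4};
    int subsizes[2] = {2, 2};
    int starts[2]   = {1, 1};

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    MPI_Type_create_subarray(2, sizes, subsizes, starts,
                             MPI_ORDER_C, MPI_INT, &subarray);
    MPI_Type_commit(&subarray);

    if (rank == 0)
        printf("MPI_Type_create_subarray and MPI_Type_commit succeeded\n");

    MPI_Type_free(&subarray);
    MPI_Finalize();
    return 0;
}

Compiled with mpicc and launched under mpirun, a program like this isolates the
datatype call from the rest of the application, which makes it easier to tell
whether the failure comes from the datatype code or from something else.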
Is there any available documentation, or write-ups with hints or general
information, on the task of porting an existing MPI application from a
different MPI implementation to OpenMPI? We have an app using mpich1 that
needs some updating or porting to run on a new platform, so I figured it would be
Wrong mailing list. This one is for Open MPI-related questions; all MPICH
and MVAPICH questions should be redirected to their respective mailing lists.
Thanks,
george.
On Feb 2, 2007, at 5:16 AM, Vadivelan Ranjith wrote:
Hi All
I used mpich2-1.0.3 to compile our code. Our code compiled fine.
But
I have OpenMPI running fine for a small/medium number of tasks (simple
hello or cpi program). But when I try 700 or 800 tasks, it hangs,
apparently on startup. I think this might be related to LDAP, since if I
try to log into my account while the job is hung, I get told my username
doesn't exist. H
That really did fix it, George:
# mpirun --prefix $MPIHOME -hostfile ~/testdir/hosts --mca btl
tcp,self --mca btl_tcp_if_exclude ib0,ib1 ~/testdir/hello
Hello from Alex' MPI test program
Process 0 on dr11.lsf.platform.com out of 2
Hello from Alex' MPI test program
Process 1 on compute-0-0.local o
Hi All
I used mpich2-1.0.3 to compile our code. Our code compiled fine. But when I try
to test our code with Intel MPI, it gives the following error:
ERROR: gfortran compiler is not in PATH for driver: mpif90
My .bashrc contains the following:
source /opt/intel/fc/9.1.037/bin/ifortvars.sh
source /opt/
Hello,
you can build your ompi with --with-devel-headers and use the header:
#define OMPI_MAJOR_VERSION 1
#define OMPI_MINOR_VERSION 1
#define OMPI_RELEASE_VERSION 4
#define OMPI_GREEK_VERSION ""
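For what it's worth, a minimal sketch of how these macros could be used once
the devel headers are installed (the exact include path is an assumption and
may depend on how the headers are laid out in your installation):

/* Print the Open MPI version this program was compiled against, using the
 * macros from ompi/version.h.  Assumes Open MPI was configured with
 * --with-devel-headers so that the header is actually installed. */
#include <stdio.h>
#include <mpi.h>
#include "ompi/version.h"

int main(int argc, char **argv)
{
    int rank;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    if (rank == 0) {
        printf("Compiled against Open MPI %d.%d.%d%s\n",
               OMPI_MAJOR_VERSION, OMPI_MINOR_VERSION,
               OMPI_RELEASE_VERSION, OMPI_GREEK_VERSION);
    }

    MPI_Finalize();
    return 0;
}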
Bert
Audet, Martin wrote:
> Hi,
>
> I would like to suggest that you add macros indicating the vers
Alex,
You should try to limit the Ethernet devices used by Open MPI during
execution. Please add "--mca btl_tcp_if_exclude eth1,ib0,ib1" to
your mpirun command line and give it a try.
Thanks,
george.
On Feb 1, 2007, at 10:29 PM, Alex Tumanov wrote:
On 2/1/07, Galen Shipman wro