You didn't say, but I assume that my other two statements were
therefore correct (1.1.2 works with a static F90 library, 1.2.3 does
not work).
Do you need the MPI F90 bindings? If not, does --disable-mpi-f90
work for you?
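For reference, a minimal configure line with that flag (the install
prefix here is only illustrative) would be:

  ./configure --prefix=/opt/openmpi --disable-mpi-f90
  make all install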
On Jul 5, 2007, at 1:37 PM, Yip, Elizabeth L wrote:
1.2.1 doe
Ok, that unfortunately doesn't make much sense -- I don't know why
opal_event_set() inside opal_event_init() would cause a segv.
Can you recompile OMPI with -g and re-run this test? The "where"
information from gdb will then give us more information.
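If the crash left a core file, a typical sequence (the binary and core
file names below are only placeholders) would be:

  gdb ./my_mpi_app core
  (gdb) where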
On Jul 5, 2007, at 12:38 PM, Ricardo
On Jul 5, 2007, at 4:02 PM, Dennis McRitchie wrote:
Any idea why the main program can't be found when running under
mpirun?
Just to be sure: you compiled your test MPI application with -g, right?
Does Open MPI need to be built with either --enable-debug or
--enable-mem-debug? The "configure
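As a quick check, ompi_info should report whether an existing
installation was built with debug support (the exact output wording
varies by version):

  ompi_info | grep -i debug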
Hi Jody,
Sorry for the super long delay. I don't know how this one got lost...
I run like this all the time. Unfortunately, it is not as simple as I
would like. Here is what I do:
1. Log into the machine using ssh -X
2. Run mpirun with the following parameters:
-mca pls rsh (This makes sure
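The usual shape of the full command (the classic xterm/gdb recipe; the
process count and application name are placeholders) is something like:

  mpirun -np 2 -mca pls rsh xterm -e gdb ./my_app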
Dear Tim and Scott
I followed the suggestions made:
>
> So you should either pass '-mca btl mx,sm,self', or just pass
> nothing at all.
> Open MPI is fairly smart at figuring out what components to
> use, so you really should not need to specify anything.
>
Using
node001>mpirun --mca btl
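A complete invocation along those lines (the process count and
application name are placeholders) would be:

  mpirun --mca btl mx,sm,self -np 4 ./my_app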
Hi Tim
Thank you for your reply.
Unfortunately my workstation has died,
and even when I try to run an Open MPI application
in a simple way, I get errors:
jody@aim-nano_02 /home/aim-cari/jody $ mpirun -np 2 --hostfile hostfile ./a.out
bash: orted: command not found
[aim-nano_02:22145] ERROR: A daemo
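The "orted: command not found" error usually means the Open MPI
binaries are not in the PATH on the remote node. One common workaround
(the install prefix here is only an example) is to let mpirun set the
remote paths itself:

  mpirun --prefix /opt/openmpi -np 2 --hostfile hostfile ./a.out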
On Jul 6, 2007, at 12:05 PM, Alex Tumanov wrote:
Eureka! I managed to get it working despite the incorrect _initial_
./configure invocation. For those interested, here are my compilation
options:
# cat ompi_build.sh
#!/bin/sh
rpmbuild --rebuild -D "configure_options \
-
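A complete invocation of that script would look something like the
following (the configure options and SRPM name below are placeholders,
not the ones actually used):

  rpmbuild --rebuild -D "configure_options --prefix=/opt/openmpi" \
      openmpi-1.2.3-1.src.rpm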
Hi Tim
(I accidentally sent the previous message before it was ready - here's
the complete one)
Thank you for your reply.
Unfortunately my workstation, on which I could successfully run Open MPI
applications, has died. But on my replacement machine (which
I assume I have set up in an equivalent way
Tim,
thanks for your suggestions.
There seems to be something wrong with the PATH:
jody@aim-nano_02 ~/progs $ ssh 130.60.49.128 printenv | grep PATH
PATH=/usr/bin:/bin:/usr/sbin:/sbin
which I don't understand. Logging via ssh into 130.60.49.128 I get:
jody@aim-nano_02 ~/progs $ ssh 130.60.49.128
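This is the classic non-interactive ssh behavior: a command run via
"ssh host command" does not get a login shell, so the login profile is
never sourced and PATH falls back to the system default. Adding the
Open MPI paths to a file the shell reads for non-interactive sessions
(~/.bashrc for bash; the prefix below is illustrative) usually fixes it:

  export PATH=/opt/openmpi/bin:$PATH
  export LD_LIBRARY_PATH=/opt/openmpi/lib:$LD_LIBRARY_PATH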