On May 6, 2014, at 6:24 PM, Clay Kirkland wrote:
> Got it to work finally. The longer line doesn't work.
>
> But if I take off the -mca oob_tcp_if_include 192.168.0.0/16 part then
> everything works from every combination of machines I have.

Interesting - I'm surprised, but glad it worked.
Got it to work finally. The longer line doesn't work.
But if I take off the -mca oob_tcp_if_include 192.168.0.0/16 part then
everything works from every combination of machines I have.

And as to any MPI having trouble: in my original posting I stated that I
installed LAM/MPI on the same hardware.
-mca btl_tcp_if_include 192.168.0.0/16 -mca oob_tcp_if_include 192.168.0.0/16
should do the trick. Any MPI is going to have trouble with your arrangement -
it just needs a little hint to help figure it out.
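
For example, combined into a single command (the hostnames and executable
here are just placeholders):

  mpirun -np 4 --host host1,host2,host3 \
      --mca btl_tcp_if_include 192.168.0.0/16 \
      --mca oob_tcp_if_include 192.168.0.0/16 \
      ./a.out
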
On May 6, 2014, at 5:14 PM, Clay Kirkland wrote:
> Someone suggested using some network address if all machines are on the
> same subnet.
Someone suggested using some network address if all machines are on the same
subnet. They are all on the same subnet, I think. I have no idea what to put
for a param there. I tried the ethernet address, but of course it couldn't be
that simple. Here are my ifconfig outputs from a couple of machines:
Are these NICs on the same IP subnet, perchance? You don't have to specify
them by name - you can say "-mca btl_tcp_if_include 10.10/16" or something.
On May 6, 2014, at 4:50 PM, Clay Kirkland wrote:
> Well it turns out I can't seem to get all three of my machines on the same
> page.
> Two of them are using eth0 and one is using eth1.
Well it turns out I can't seem to get all three of my machines on the same
page. Two of them are using eth0 and one is using eth1. CentOS seems unable
to bring up multiple network interfaces for some reason, and when I use the
mca param to use eth0 it works on two machines but not the other.
That last trick seems to work. I can get it to work once in a while with
those tcp options, but it is tricky as I have three machines and two of them
use eth0 as the primary network interface and one uses eth1. But by fiddling
with network options and perhaps moving a cable or two I think I can get it
working.
Are you using TCP as the MPI transport?
If so, another thing to try is to limit the IP interfaces that MPI uses for its
traffic to see if there's some kind of problem with specific networks.
For example:
mpirun --mca btl_tcp_if_include eth0 ...
If that works, then try adding in any/all other IP interfaces.
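
For instance, if eth0 works, interfaces can be listed comma-separated (eth1
here is just an example name):

  mpirun --mca btl_tcp_if_include eth0,eth1 ...
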
On May 6, 2014, at 9:40 AM, Imran Ali wrote:
> My install was in my user directory (i.e $HOME). I managed to locate the
> source directory and successfully run make uninstall.
FWIW, I usually install Open MPI into its own subdir, e.g.,
$HOME/installs/openmpi-x.y.z. Then if I don't want that installation any
more, I can just remove that one subdir.
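
For example, a build along these lines (the version number is illustrative):

  ./configure --prefix=$HOME/installs/openmpi-1.8.1
  make all install

Removing $HOME/installs/openmpi-1.8.1 afterwards then deletes that
installation cleanly, with no stale files left behind.
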
On May 6, 2014, at 3:34 PM, Jeff Squyres (jsquyres) wrote:
> On May 6, 2014, at 9:32 AM, Imran Ali wrote:
>
>> I will attempt that then. I read at
>>
>> http://www.open-mpi.org/faq/?category=building#install-overwrite
>>
>> that I should completely uninstall my previous version.
>
> Yes, that is best.
On May 6, 2014, at 9:32 AM, Imran Ali wrote:
> I will attempt that then. I read at
>
> http://www.open-mpi.org/faq/?category=building#install-overwrite
>
> that I should completely uninstall my previous version.
Yes, that is best. OR: you can install into a whole separate tree and ignore
the old installation.
On May 6, 2014, at 2:56 PM, Jeff Squyres (jsquyres) wrote:
> The thread support in the 1.6 series is not very good. You might try:
>
> - Upgrading to 1.6.5
> - Or better yet, upgrading to 1.8.1
>
I will attempt that then. I read at

http://www.open-mpi.org/faq/?category=building#install-overwrite

that I should completely uninstall my previous version.
The thread support in the 1.6 series is not very good. You might try:
- Upgrading to 1.6.5
- Or better yet, upgrading to 1.8.1
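
If it helps, the thread support level compiled into a given installation can
be checked with ompi_info, e.g.:

  ompi_info | grep -i thread
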
On May 6, 2014, at 7:24 AM, Imran Ali wrote:
> I get the following error when I try to run the following python code
>
> import mpi4py.MPI as MPI
> comm = MPI.COMM_WORLD
I get the following error when I try to run the following python code:

import mpi4py.MPI as MPI
comm = MPI.COMM_WORLD
MPI.File.Open(comm,"some.file")

$ mpirun -np 1 python test_mpi.py
Traceback (most recent call last):
  File "test_mpi.py", line 3, in <module>
    MPI.File.Open(comm," h5ex_d_alloc.h5")