Re: [OMPI users] Using Multiple Gigabit Ethernet Interface

2006-03-13 Thread Michael Kluskens
On Mar 11, 2006, at 1:00 PM, Jayabrata Chakrabarty wrote: Hi I have been looking for information on how to use multiple Gigabit Ethernet interfaces for MPI communication. So far what I have found out is I have to use mca_btl_tcp. But what I wish to know is what IP address to assign to each

Re: [OMPI users] Using Multiple Gigabit Ethernet Interface

2006-03-13 Thread Brian Barrett
On Mar 11, 2006, at 1:00 PM, Jayabrata Chakrabarty wrote: Hi I have been looking for information on how to use multiple Gigabit Ethernet interfaces for MPI communication. So far what I have found out is I have to use mca_btl_tcp. But what I wish to know is what IP address to assign to each

Re: [OMPI users] Using Multiple Gigabit Ethernet Interface

2006-03-13 Thread Brian Barrett
On Mar 13, 2006, at 8:38 AM, Michael Kluskens wrote: On Mar 11, 2006, at 1:00 PM, Jayabrata Chakrabarty wrote: Hi I have been looking for information on how to use multiple Gigabit Ethernet interfaces for MPI communication. So far what I have found out is I have to use mca_btl_tcp. But what I
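The question in this thread, steering Open MPI's TCP BTL across multiple Gigabit NICs, is normally handled with MCA parameters rather than special addressing. A minimal sketch (the interface names `eth0`/`eth1`, host names, and application name are placeholders, not taken from the thread):

```shell
# Restrict MPI traffic to the TCP BTL (plus self for loopback) and name
# the NICs it may use; the tcp BTL stripes messages across all listed
# interfaces on each node.
mpirun --mca btl tcp,self \
       --mca btl_tcp_if_include eth0,eth1 \
       -np 4 --host node1,node2 ./my_mpi_app
```

Each interface on a node is typically placed on its own subnet so the BTL can distinguish the networks; the same effect can be had with `btl_tcp_if_exclude` to drop unwanted interfaces instead.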

Re: [OMPI users] problems with OpenMPI-1.0.1 on SunOS 5.9; problems on heterogeneous cluster

2006-03-13 Thread Brian Barrett
Hi Ravi - With the help of another Open MPI user, I spent the weekend finding a couple of issues with Open MPI on Solaris. I believe you are running into the same problems. We're in the process of certifying the changes for release as part of 1.0.2, but it's Monday morning and the relea

Re: [OMPI users] Open MPI and MultiRail InfiniBand

2006-03-13 Thread Galen Shipman
It looks like we never added similar logic to the Open IB transport. I'll pass your request on to the developer of our Open IB transport. Given our timeframe for releasing Open MPI 1.0.2, it's doubtful any change will make that release. But it should definitely be possible to add such functionali

Re: [OMPI users] Run failure on Solaris Opteron with Sun Studio 11

2006-03-13 Thread Brian Barrett
On Mar 9, 2006, at 12:18 PM, Pierre Valiron wrote: - Configure and compile are okay Good to hear. - However compiling the mpi.f90 takes over 35 *minutes* with -O1. This seems a bit excessive... I tried removing any -O option and things are just as slow. Is this behaviour related to open

Re: [OMPI users] Open MPI and MultiRail InfiniBand

2006-03-13 Thread Troy Telford
On Mon, 13 Mar 2006 07:37:10 -0700, Galen Shipman wrote: It looks like we never added similar logic to the Open IB transport. I'll pass your request on to the developer of our Open IB transport. Given our timeframe for releasing Open MPI 1.0.2, it's doubtful any change will make that release.

Re: [OMPI users] Open MPI and MultiRail InfiniBand

2006-03-13 Thread Galen Shipman
This was my oversight, I am getting to it now, should have something in just a bit. - Galen I can live with that, certainly. Fortunately, there's a couple months until I have a real /need/ for this. -- Hi Troy, I have added max_btls to the openib component on the trunk, try: mpirun --mca

Re: [OMPI users] Open MPI and MultiRail InfiniBand

2006-03-13 Thread Jean-Christophe Hugly
On Mon, 2006-03-13 at 10:57 -0700, Galen Shipman wrote: > >> This was my oversight, I am getting to it now, should have something > >> in just a bit. > >> > >> - Galen > > > > I can live with that, certainly. Fortunately, there's a couple months > > until I have a real /need/ for this. > > -- >

Re: [OMPI users] Run failure on Solaris Opteron with Sun Studio 11

2006-03-13 Thread Pierre Valiron
Brian Barrett wrote: b) whether the code was compiled with mpif77 or mpif90, execution fails: valiron@icare ~/BENCHES > mpirun -np 2 all Signal:11 info.si_errno:0(Error 0) si_code:1(SEGV_MAPERR) Failing at addr:40 *** End of error message *** Signal:11 info.si_errno:0(Error 0) si_code:1(SEGV_

Re: [OMPI users] Open MPI and MultiRail InfiniBand

2006-03-13 Thread Troy Telford
I have added max_btls to the openib component on the trunk, try: mpirun --mca btl_openib_max_btls 1 ...etc I don't have a dual nic machine handy to test on, if this checks out we can patch the release branch. Thanks, Galen I'll get to it as soon as I can; but it may be a few days. -- Troy Te
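Galen's suggested test, spelled out as a comparison run (the benchmark executable name is a placeholder; only the `btl_openib_max_btls` parameter comes from the thread):

```shell
# Limit the openib BTL to a single HCA port, disabling multirail striping:
mpirun --mca btl_openib_max_btls 1 -np 2 ./ib_bandwidth_test

# Default behavior for comparison: the openib BTL uses all available rails.
mpirun -np 2 ./ib_bandwidth_test
```

Comparing bandwidth between the two runs shows whether multirail striping is actually engaging on a dual-HCA machine.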

Re: [OMPI users] Success on Solaris Opteron with Sun Studio 11

2006-03-13 Thread Pierre Valiron
I have successfully built openmpi-1.1a1r9260 (from the subversion trunk) in 64-bit mode on Solaris Opteron. This r9260 tarball incorporates the last patches for Solaris from Brian Barrett. In order to accelerate the build I disabled the f90 bindings. My build script is as follows: #! /bin/tc
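A configure invocation along the lines Pierre describes, a 64-bit Sun Studio build with the Fortran 90 bindings disabled to speed up compilation, might look like the following. This is a hypothetical reconstruction from the message, not his actual script; the install prefix and `-xarch` flags are assumptions:

```shell
#!/bin/tcsh
# Hypothetical sketch: Sun Studio 11 compilers, 64-bit mode,
# f90 bindings disabled to avoid the very slow mpi.f90 compile.
setenv CC  cc
setenv CXX CC
setenv F77 f77
setenv FC  f95
./configure --prefix=/opt/openmpi \
            --disable-mpi-f90 \
            CFLAGS="-xarch=amd64"  CXXFLAGS="-xarch=amd64" \
            FFLAGS="-xarch=amd64"  FCFLAGS="-xarch=amd64"
make all install
```

Disabling the f90 bindings only removes the `use mpi` module interface; codes using `include 'mpif.h'` with mpif77 are unaffected.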

Re: [OMPI users] Success on Solaris Opteron with Sun Studio 11

2006-03-13 Thread Brian Barrett
On Mar 13, 2006, at 4:36 PM, Pierre Valiron wrote: I have successfully built openmpi-1.1a1r9260 (from the subversion trunk) in 64-bit mode on Solaris Opteron. This r9260 tarball incorporates the last patches for Solaris from Brian Barrett. Just a quick note - these changes were recently m