Hi,
On 27.06.2008, at 20:00, Azhar Ali Shah wrote:
Using openmpi-1.2.6 with SGE 6.1u3, when I submit a job script
containing:
mpirun -n $NSLOTS -machinefile $TMPDIR/machines ~/openmpi_test
it produces the following error in the output file:
cat: /tmp/174.1.all.q/machines: No such file or directory
I guess you just have to make sure the machine file you're trying to
use is actually there. What does "ls -l $TMPDIR/machines" show?
george.
Hi,
Using openmpi-1.2.6 with SGE 6.1u3, when I submit a job script containing:
mpirun -n $NSLOTS -machinefile $TMPDIR/machines ~/openmpi_test
it produces the following error in the output file:
cat: /tmp/174.1.all.q/machines: No such file or directory
---
Thanks Rainer and Matt, your suggestions solved my problem.
On Fri, Jun 27, 2008 at 11:44 AM, Matt Hughes wrote:
> 2008/6/27 Joao Marcelo :
>> Hi,
>>
>> I'm starting to code with MPI and decided to use openmpi. I'm using
>> Ubuntu Linux with GCC version 4.2.3 and OpenMPI 1.2.5 (distribution
>> package). The output of "ompi_info --all" is attached. I'm also
>> sending a copy of the source code I'm trying to run.
Dear Joao,
the problem is that you do not use &reqs[i] correctly in the MPI_Isend --
reqs[0] will never be initialized, but it is waited for in MPI_Waitall...
Change:
rc = MPI_Isend(&a, 1, MPI_INT, i, 0, MPI_COMM_WORLD, &reqs[i]);
to
rc = MPI_Isend(&a, 1, MPI_INT, i, 0, MPI_COMM_WORLD, &reqs[i-1]);
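For reference, a minimal self-contained sketch of the corrected pattern. It
assumes, as the uninitialized reqs[0] suggests, that the root loops over the
destination ranks 1..size-1; the variable names and the printout on the
receiving ranks are only illustrative, not taken from Joao's actual program:

#include <mpi.h>
#include <stdio.h>
#include <stdlib.h>

int main(int argc, char **argv)
{
    int rank, size, a = 42;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);

    if (rank == 0) {
        /* One request per destination rank 1..size-1, i.e. size-1 entries. */
        MPI_Request *reqs = malloc((size - 1) * sizeof(MPI_Request));
        int i;
        for (i = 1; i < size; i++) {
            /* Index with i-1 so the send to rank 1 lands in reqs[0];
             * indexing with i would leave reqs[0] uninitialized and
             * make MPI_Waitall wait on garbage. */
            MPI_Isend(&a, 1, MPI_INT, i, 0, MPI_COMM_WORLD, &reqs[i - 1]);
        }
        MPI_Waitall(size - 1, reqs, MPI_STATUSES_IGNORE);
        free(reqs);
    } else {
        int b;
        MPI_Recv(&b, 1, MPI_INT, 0, 0, MPI_COMM_WORLD, MPI_STATUS_IGNORE);
        printf("rank %d received %d\n", rank, b);
    }

    MPI_Finalize();
    return 0;
}

Built with mpicc and launched under mpirun, each non-root rank should then
print the value it received from rank 0.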
Hi,
I'm starting to code with MPI and decided to use openmpi. I'm using
Ubuntu Linux with GCC version 4.2.3 and OpenMPI 1.2.5 (distribution
package). The output of "ompi_info --all" is attached. I'm also
sending a copy of the source code I'm trying to run.
What I'm trying to do is selecting pro
Hi, I am trying to use the latest release of v1.3 to test with BLCR;
however, I just noticed that sometime after 1.3a1r18423 the standard
MPICH sample code (cpi.c) stopped working on our rel4-based Myrinet
GM clusters, which raises some concern.
Please find attached: gm_board_info.out, ompi_info--a