On Feb 12, 2009, at 2:00 PM, jody wrote:

In my application I use MPI_PROC_NULL
as an argument in MPI_Sendrecv to simplify the
program (i.e., no special cases for borders).
With 1.3 it works, but under 1.3.1a0r20520
I get the following error:
[jody@localhost 3D]$ mpirun -np 2 ./sr
[localhost.localdomain:29253] *** An error occurred in MPI_Sendrecv
[localhost.localdomain:29253] *** on communicator MPI_COMM_WORLD
[localhost.localdomain:29253] *** MPI_ERR_RANK: invalid rank
[localhost.localdomain:29253] *** MPI_ERRORS_ARE_FATAL (goodbye)
[localhost.localdomain:29252] *** An error occurred in MPI_Sendrecv
[localhost.localdomain:29252] *** on communicator MPI_COMM_WORLD
[localhost.localdomain:29252] *** MPI_ERR_RANK: invalid rank
[localhost.localdomain:29252] *** MPI_ERRORS_ARE_FATAL (goodbye)
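
For reference, here is a minimal sketch of the pattern jody describes: in a hypothetical 1-D decomposition, the border ranks pass MPI_PROC_NULL as the missing neighbor, so that half of the MPI_Sendrecv completes immediately as a no-op. The names below are illustrative, not jody's actual code.

#include <stdio.h>
#include "mpi.h"

int main(int argc, char **argv) {
    int rank, size, up, down;
    double send, recv = 0.0;
    MPI_Status st;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);

    /* Neighbors in a 1-D decomposition.  The first and last ranks
       have no neighbor on one side, so they use MPI_PROC_NULL there;
       a send or receive involving MPI_PROC_NULL returns immediately
       without transferring data, so no special border cases are needed. */
    down = (rank == 0)        ? MPI_PROC_NULL : rank - 1;
    up   = (rank == size - 1) ? MPI_PROC_NULL : rank + 1;

    /* Shift one value "upward": send to up, receive from down.
       On rank 0 the receive is a no-op and recv keeps its initial value. */
    send = (double)rank;
    MPI_Sendrecv(&send, 1, MPI_DOUBLE, up,   0,
                 &recv, 1, MPI_DOUBLE, down, 0,
                 MPI_COMM_WORLD, &st);

    printf("rank %d got halo value %g\n", rank, recv);

    MPI_Finalize();
    return 0;
}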

Your program as written should hang, right? You're trying to receive from MPI_COMM_WORLD rank 1, and no process is sending.

I slightly modified your code:

#include <stdio.h>
#include "mpi.h"

int main() {
    int iRank;
    int iSize;
    MPI_Status st;

    MPI_Init(NULL, NULL);
    MPI_Comm_size(MPI_COMM_WORLD, &iSize);
    MPI_Comm_rank(MPI_COMM_WORLD, &iRank);

    if (1 == iRank) {
        /* Rank 1 sends one int to rank 0 with tag 77. */
        MPI_Send(&iSize, 1, MPI_INT, 0, 77, MPI_COMM_WORLD);
    } else if (0 == iRank) {
        /* Rank 0's send half targets MPI_PROC_NULL, so it is a no-op;
           the receive half matches rank 1's send above. */
        MPI_Sendrecv(&iRank, 1, MPI_INT, MPI_PROC_NULL, 77,
                     &iSize, 1, MPI_INT, 1, 77, MPI_COMM_WORLD, &st);
    }

    MPI_Finalize();
    return 0;
}
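
To reproduce, build and run with the usual Open MPI wrappers (assuming the source is saved as sr.c):

  mpicc sr.c -o sr
  mpirun -np 2 ./sr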

And that works fine for me at the head of the v1.3 branch:

[16:17] svbu-mpi:~/svn/ompi-1.3 % svnversion .
20538

We did have a few bad commits on the v1.3 branch recently; could you try with a tarball from tonight, perchance?

--
Jeff Squyres
Cisco Systems
