Can you confirm that you're using the right mpif.h?

Keep in mind that each MPI implementation's mpif.h is different -- it's a common mistake to assume that the mpif.h from one MPI implementation will work with another (e.g., someone copied mpif.h from one MPI implementation into your software's source tree, so the compiler always finds that one instead of the mpif.h provided by the MPI implementation you're actually using).


On Jul 28, 2009, at 1:17 PM, Ricardo Fonseca wrote:

Hi George

I did some extra digging and found that, for some reason, the MPI_IN_PLACE parameter is not being recognized as such by mpi_reduce_f (reduce_f.c:61). I added a few printfs:

    printf(" sendbuf = %p \n", sendbuf );

    printf(" MPI_FORTRAN_IN_PLACE = %p \n", &MPI_FORTRAN_IN_PLACE );
    printf(" mpi_fortran_in_place = %p \n", &mpi_fortran_in_place );
    printf(" mpi_fortran_in_place_ = %p \n", &mpi_fortran_in_place_ );
    printf(" mpi_fortran_in_place__ = %p \n", &mpi_fortran_in_place__ );

And this is what I get on node 0:

 sendbuf = 0x50920
 MPI_FORTRAN_IN_PLACE = 0x17cd30
 mpi_fortran_in_place = 0x17cd34
 mpi_fortran_in_place_ = 0x17cd38
 mpi_fortran_in_place__ = 0x17cd3c

This makes OMPI_F2C_IN_PLACE(sendbuf) fail. If I replace the line:

sendbuf = OMPI_F2C_IN_PLACE(sendbuf);

with:

    if ( sendbuf == (char *) 0x50920 ) {  /* hardcoded sendbuf address printed above */
      printf("sendbuf is MPI_IN_PLACE!\n");
      sendbuf = MPI_IN_PLACE;
    }

Then the code works and gives the correct result:

sendbuf is MPI_IN_PLACE!
 Result:
 3. 3. 3. 3.

So my guess is that somehow the MPI_IN_PLACE constant for Fortran is getting the wrong address. Could this be related to the Fortran compilers I'm using (ifort / g95)?

Ricardo

---
Prof. Ricardo Fonseca

GoLP - Grupo de Lasers e Plasmas
Instituto de Plasmas e Fusão Nuclear
Instituto Superior Técnico
Av. Rovisco Pais
1049-001 Lisboa
Portugal

tel: +351 21 8419202
fax: +351 21 8464455
web: http://cfp.ist.utl.pt/golp/

On Jul 28, 2009, at 17:00 , users-requ...@open-mpi.org wrote:

Message: 1
Date: Tue, 28 Jul 2009 11:16:34 -0400
From: George Bosilca <bosi...@eecs.utk.edu>
Subject: Re: [OMPI users] MPI_IN_PLACE in Fortran with MPI_REDUCE / MPI_ALLREDUCE
To: Open MPI Users <us...@open-mpi.org>


--
Jeff Squyres
jsquy...@cisco.com
