Re: [OMPI users] OMPI 3.0.0 crashing at mpi_init on OS X using Fortran [FIXED]

2017-12-14 Thread Ricardo Fonseca
pi_init.c:486
frame #2: 0x000100eb3f38 libmpi.40.dylib`PMPI_Init(argc=0x7ffeefbfe2d0, argv=0x7ffeefbfe2c8) at pinit.c:66
frame #3: 0x000100cceb0b libmpi_mpifh.40.dylib`ompi_init_f(ierr=0x7ffeefbfe9f8) at init_f.c:84
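The backtrace above ends in ompi_init_f, i.e. the crash happens inside the very first MPI call made from Fortran. A minimal reproducer for a failure at that point (an assumed sketch, not the poster's actual program; compiled with mpifort) is simply:

```fortran
! Minimal sketch: the smallest Fortran program that reaches ompi_init_f.
! If mpi_init itself crashes, the problem is in the installation, not the code.
program init_test
  use mpi
  implicit none
  integer :: ierr, rank

  call mpi_init(ierr)                            ! frame #3 in the backtrace (ompi_init_f)
  call mpi_comm_rank(MPI_COMM_WORLD, rank, ierr)
  print *, 'rank', rank, 'initialized, ierr =', ierr
  call mpi_finalize(ierr)
end program init_test
```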

[OMPI users] Help with multicore AMD machine performance

2012-03-30 Thread Ricardo Fonseca
16K
L1i cache: 64K
L2 cache: 2048K
L3 cache: 6144K
NUMA node0 CPU(s): 0,2,4,6,8,10,12,14
NUMA node1 CPU(s): 16,18,20,22,24,26,28,30
NUMA node2 CPU(s): 1,3,5,7,9,11,13,15
NUMA node3 CPU(s): 17,19,21,23,25,27,29,31
--- Ricardo Fonseca Associate

Re: [OMPI users] MPI_IN_PLACE in Fortran with MPI_REDUCE / MPI_ALLREDUCE

2009-08-04 Thread Ricardo Fonseca
Thanks again for your help, Ricardo
--- Prof. Ricardo Fonseca
GoLP - Grupo de Lasers e Plasmas
Instituto de Plasmas e Fusão Nuclear
Instituto Superior Técnico
Av. Rovisco Pais 1049-001 Lisboa Portugal
tel: +351 21 8419202 fax: +351 21 8464455
web: http://cfp.ist.utl.pt/golp/
On Aug 1, 2009, at 17:

Re: [OMPI users] MPI_IN_PLACE in Fortran with MPI_REDUCE / MPI_ALLREDUCE

2009-07-30 Thread Ricardo Fonseca
the compiler always finds that one instead of the MPI-implementation-provided mpif.h.). On Jul 28, 2009, at 1:17 PM, Ricardo Fonseca wrote: Hi George, I did some extra digging and found that (for some reason) the MPI_IN_PLACE parameter is not being recognized as such by mpi_reduce_f (reduce_f.c:61). I adde

Re: [OMPI users] users Digest, Vol 1302, Issue 1

2009-07-29 Thread Ricardo Fonseca
always finds that one instead of the MPI-implementation-provided mpif.h.). On Jul 28, 2009, at 1:17 PM, Ricardo Fonseca wrote:

Re: [OMPI users] MPI_IN_PLACE in Fortran with MPI_REDUCE / MPI_ALLREDUCE

2009-07-28 Thread Ricardo Fonseca
the wrong address. Could this be related to the Fortran compilers I'm using (ifort / g95)? Ricardo

Re: [OMPI users] MPI_IN_PLACE in Fortran with MPI_REDUCE / MPI_ALLREDUCE

2009-07-28 Thread Ricardo Fonseca
Hi George, I don't think this is a library mismatch. I just followed your instructions and got:
$ otool -L a.out
a.out:
    /opt/openmpi/1.3.3-g95-32/lib/libmpi_f77.0.dylib (compatibility version 1.0.0, current version 1.0.0)
    /opt/openmpi/1.3.3-g95-32/lib/libmpi.0.dylib (compatibility version

Re: [OMPI users] MPI_IN_PLACE in Fortran with MPI_REDUCE / MPI_ALLREDUCE

2009-07-28 Thread Ricardo Fonseca
of some --mca options I should try? (or any other ideas...) Cheers, Ricardo

[OMPI users] MPI_IN_PLACE in Fortran with MPI_REDUCE / MPI_ALLREDUCE

2009-07-27 Thread Ricardo Fonseca
if ( rank == 0 ) then
  print *, 'Result:'
  print *, buffer
endif

rc = 0
call mpi_finalize( rc )

end program
--- Any ideas? Cheers, Ricardo
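The fragment above is only the tail of the poster's test program. A self-contained sketch of the pattern being tested, in-place reduction on the root rank (a reconstruction under assumed buffer shape and types, not the original code), would be:

```fortran
! Sketch of MPI_IN_PLACE with mpi_reduce: the root passes MPI_IN_PLACE as the
! send buffer, so its own data is included and the result overwrites buffer
! on rank 0. Non-root ranks pass a normal send buffer; their recv buffer is
! ignored, so a dummy is used to avoid aliasing send and recv arguments.
program inplace_reduce
  use mpi
  implicit none
  integer :: rank, ierr
  integer, dimension(4) :: buffer, dummy

  call mpi_init( ierr )
  call mpi_comm_rank( MPI_COMM_WORLD, rank, ierr )

  buffer = rank + 1

  if ( rank == 0 ) then
    call mpi_reduce( MPI_IN_PLACE, buffer, 4, MPI_INTEGER, MPI_SUM, 0, &
                     MPI_COMM_WORLD, ierr )
  else
    call mpi_reduce( buffer, dummy, 4, MPI_INTEGER, MPI_SUM, 0, &
                     MPI_COMM_WORLD, ierr )
  endif

  if ( rank == 0 ) then
    print *, 'Result:'
    print *, buffer
  endif

  call mpi_finalize( ierr )
end program inplace_reduce
```

The thread's symptom, MPI_IN_PLACE not being recognized inside mpi_reduce_f, would show up here as rank 0's own contribution being corrupted or dropped from the sum.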

[OMPI users] Problems compiling openmpi 1.2 under AIX 5.2

2007-03-23 Thread Ricardo Fonseca
"mpirun was unable to cleanly terminate the daemons for this job. Returned value Not implemented instead of ORTE_SUCCESS." at the job end. Keep up the good work, cheers, Ricardo
--- Prof. Ricardo Fonseca
GoLP - Grupo de Lasers e Plasmas
Centro de Física dos Plasmas
Instituto Superior Técnico
Av.