Re: [OMPI users] scaling problem with openmpi

2009-05-15 Thread Gus Correa
Hi Roman, I googled around and found that CPMD is a molecular dynamics program. (What would become of civilization without Google?) Unfortunately I have kind of wiped Schrödinger's equation, Quantum Mechanics, and the Born approximation from my mind; I learned them probably before you were born. I could ...

Re: [OMPI users] scaling problem with openmpi

2009-05-15 Thread Gus Correa
Hi Roman, just a guess: is this a domain decomposition code? (I never heard of the "cpmd 32 waters" test before, sorry.) Is it based on finite differences, finite volumes, or finite elements? If it is, once the size of the subdomains becomes too small compared to the size of the halo around them, the overhead of exchanging halo data starts to dominate the useful computation ...
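
To make that surface-to-volume argument concrete, here is a minimal C sketch (not from the original thread; the halo width and subdomain sizes are illustrative assumptions) estimating the fraction of data traffic relative to work for a cubic subdomain of side n with a halo of width h:

    #include <stdio.h>

    /* Illustrative sketch: for a cubic subdomain of side n cells with a
     * halo of width h, interior work scales like n^3 while the halo data
     * exchanged scales like the surface area, roughly 6 * n^2 * h.
     * As n shrinks (more MPI ranks splitting a fixed domain), the
     * communication fraction grows -- one classic cause of the poor
     * scaling described in this thread. */
    int main(void)
    {
        const int h = 2; /* assumed halo width in cells */
        for (int n = 64; n >= 4; n /= 2) {
            double work = (double)n * n * n;   /* interior cells        */
            double halo = 6.0 * n * n * h;     /* halo cells exchanged  */
            printf("n = %3d  halo/work = %.3f\n", n, halo / work);
        }
        return 0;
    }

The ratio works out to 6h/n, so halving the subdomain side doubles the relative communication cost.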

[OMPI users] scaling problem with openmpi

2009-05-15 Thread Roman Martonak
Hello, I observe very poor scaling with openmpi on an HP blade system consisting of 8 blades (each with two quad-core 2.2 GHz AMD Barcelona CPUs) interconnected with an Infiniband fabric. When running the standard cpmd 32 waters test, I observe the following scaling (the numbers are elapsed times) ...
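
For reference when reading elapsed-time tables like the one truncated above, speedup and parallel efficiency follow directly from the timings. A minimal sketch in C (the times[] values below are placeholders, NOT the actual cpmd results from this post):

    #include <stdio.h>

    /* Compute speedup S(p) = T(1)/T(p) and efficiency E(p) = S(p)/p
     * from elapsed times.  All timing values here are hypothetical. */
    int main(void)
    {
        const int    procs[] = { 1, 8, 16, 32, 64 };
        const double times[] = { 1000.0, 140.0, 80.0, 55.0, 50.0 };
        const int    n = sizeof procs / sizeof procs[0];

        for (int i = 0; i < n; i++) {
            double speedup = times[0] / times[i];
            printf("p = %2d  T = %7.1f s  S = %5.2f  E = %4.0f%%\n",
                   procs[i], times[i], speedup,
                   100.0 * speedup / procs[i]);
        }
        return 0;
    }

Efficiency dropping sharply as p grows is the signature of the communication-bound regime Gus describes above.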

Re: [OMPI users] OpenMPI 1.3.2 with PathScale 3.2

2009-05-15 Thread Jeff Squyres
FWIW, I'm able to duplicate the error. It definitely looks like a[nother] PathScale bug to me. Perhaps David's suggestions to disable some of the optimizations may help; otherwise, you can disable that entire chunk of code with the following: --enable-contrib-no-build=vt (as Ralph mentioned ...
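
For completeness, that flag goes on Open MPI's configure line. A sketch of what the full invocation might look like (the install prefix and compiler names are illustrative assumptions based on the PathScale toolchain discussed in this thread):

    ./configure --prefix=/opt/openmpi-1.3.2 \
        CC=pathcc CXX=pathCC F77=pathf90 FC=pathf90 \
        --enable-contrib-no-build=vt
    make all install

Skipping the VT (VampirTrace) contrib package only removes the optional tracing support; the core MPI library builds as usual.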

Re: [OMPI users] OpenMPI 1.3.2 with PathScale 3.2

2009-05-15 Thread David O. Gunter
PathScale supports -O3 (at least as of the 3.1 line). Here are some suggestions from the 3.2 Users Manual you may also want to try. -david If there are numerical problems with -O3 -OPT:Ofast, then try either of the following: -O3 -OPT:Ofast:ro=1 or -O3 -OPT:Ofast:div_split=OFF. Note that 'ro' ...
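
If the suspect code is Open MPI itself rather than the application, one way to apply those fallback flags is through the standard compiler-flag variables at configure time. A sketch (passing flags this way is ordinary configure practice, not something prescribed in the thread):

    ./configure CC=pathcc FC=pathf90 \
        CFLAGS="-O3 -OPT:Ofast:ro=1" \
        FCFLAGS="-O3 -OPT:Ofast:ro=1" ...

The div_split=OFF variant would be substituted the same way if the roundoff setting does not cure the miscompilation.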

Re: [OMPI users] OpenMPI deadlocks and race conditions ?

2009-05-15 Thread François PELLEGRINI
Hello Eugene, Eugene Loh wrote (Thu, 14 May 2009, Re: [OMPI users] OpenMPI deadlocks and race conditions ?): ...

Re: [OMPI users] Problem installing Dalton with OpenMPI over PelicanHPC

2009-05-15 Thread Silviu Groza
Hi, I still have not solved these errors. I need help installing the Dalton quantum chemistry package with OpenMPI. Thank you. ---> Linking sequential dalton.x ... mpif77.openmpi -march=x86-64 -O3 -ffast-math -fexpensive-optimizations -funroll-loops -fno-range-check -fsecond-underscore \ -o /r ...
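
When a link step like this fails, one standard Open MPI debugging step (a suggestion on my part, not something from the original post) is to ask the wrapper compiler what it actually runs:

    mpif77.openmpi --showme

This prints the full underlying compiler and linker command line, including the Open MPI include paths and libraries, which makes it easier to see whether the failure comes from Dalton's own flags or from the MPI wrapper.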