Re: [OMPI users] Strange affinity messages with 1.8 and torque 5

2014-09-24 Thread Allin Cottrell
On Wed, 24 Sep 2014, Ralph Castain wrote: As I said, we removed the warning starting in 1.8.3. On Sep 24, 2014, at 1:23 PM, Brock Palen wrote: So very hetero. I did some testing and I couldn't make it happen below 32 cores. Not sure if this is the real issue [...] Just to amplify Ralph's resp

Re: [OMPI users] Strange affinity messages with 1.8 and torque 5

2014-09-24 Thread Ralph Castain
As I said, we removed the warning starting in 1.8.3. On Sep 24, 2014, at 1:23 PM, Brock Palen wrote: > So very hetero. I did some testing and I couldn't make it happen below 32 > cores. Not sure if this is the real issue or if it requires a specific layout: > > [brockp@nyx5512 ~]$ cat $PBS_NODEFILE

Re: [OMPI users] Strange affinity messages with 1.8 and torque 5

2014-09-24 Thread Brock Palen
So very hetero. I did some testing and I couldn't make it happen below 32 cores. Not sure if this is the real issue or if it requires a specific layout:

[brockp@nyx5512 ~]$ cat $PBS_NODEFILE | sort | uniq -c
      1 nyx5512
      1 nyx5515
      1 nyx5518
      1 nyx5523
      1 nyx5527
      2 nyx
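The counting trick Brock uses above works on any slot-per-line nodefile. A minimal sketch, using a fabricated nodefile in place of the real $PBS_NODEFILE (node names here are made up for illustration):

```shell
# Count how many slots Torque granted on each node, as in
# "cat $PBS_NODEFILE | sort | uniq -c". A nodefile lists one line per
# slot, so sorting groups repeats and uniq -c counts them per node.
nodefile=$(mktemp)
printf '%s\n' nyx5512 nyx5515 nyx5512 nyx5518 > "$nodefile"

# Count then order by node name; e.g. a count of 2 next to nyx5512
# means two slots were granted on that node.
sort "$nodefile" | uniq -c | sort -k2
rm -f "$nodefile"
```

A heavily skewed count here (many slots on one node, one slot on several others) is exactly the kind of heterogeneous layout the thread suspects of triggering the warning.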

Re: [OMPI users] Strange affinity messages with 1.8 and torque 5

2014-09-23 Thread Maxime Boissonneault
Do you know the topology of the cores allocated by Torque (i.e. were they all on the same nodes, or 8 per node, or a heterogeneous distribution, for example)? On 2014-09-23 15:05, Brock Palen wrote: Yes, the request to Torque was procs=64. We are using cpusets. The mpirun without -np 64 c

Re: [OMPI users] Strange affinity messages with 1.8 and torque 5

2014-09-23 Thread Brock Palen
Yes, the request to Torque was procs=64. We are using cpusets. The mpirun without -np 64 creates 64 spawned hostnames. Brock Palen www.umich.edu/~brockp CAEN Advanced Computing XSEDE Campus Champion bro...@umich.edu (734)936-1985 On Sep 23, 2014, at 3:02 PM, Ralph Castain wrote: > FWIW: th

Re: [OMPI users] Strange affinity messages with 1.8 and torque 5

2014-09-23 Thread Ralph Castain
FWIW: that warning has been removed from the upcoming 1.8.3 release. On Sep 23, 2014, at 11:45 AM, Reuti wrote: > On 23.09.2014 at 19:53, Brock Palen wrote: > >> I found a fun head scratcher: with openmpi 1.8.2 with torque 5 built with >>

Re: [OMPI users] Strange affinity messages with 1.8 and torque 5

2014-09-23 Thread Maxime Boissonneault
Hi, Just an idea here. Do you use cpusets within Torque? Did you request enough cores from Torque? Maxime Boissonneault On 2014-09-23 13:53, Brock Palen wrote: I found a fun head scratcher: with openmpi 1.8.2 with torque 5 built with TM support, on hetero core layouts I get the fun thing
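Maxime's cpuset question can be checked from inside the job itself. A minimal sketch, assuming a Linux node; it reads the kernel's view of the CPUs this shell is allowed to run on (the Torque-specific cpuset directory mentioned in the comment varies by installation and is an assumption, not something confirmed in the thread):

```shell
# Show the CPU list the kernel actually allows for this process.
# Inside a Torque job with cpusets enabled, this reflects the cores
# the scheduler granted rather than the whole node.
grep -E 'Cpus_allowed_list' /proc/self/status

# Torque installs commonly also expose the job's cpuset under a path
# like /dev/cpuset/torque/<jobid>/ (location is site-dependent).
```

If the allowed list is narrower than mpirun expects when it tries to bind, that mismatch is one plausible source of the affinity warnings being discussed.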

Re: [OMPI users] Strange affinity messages with 1.8 and torque 5

2014-09-23 Thread Reuti
On 23.09.2014 at 19:53, Brock Palen wrote: > I found a fun head scratcher: with openmpi 1.8.2 with torque 5 built with TM > support, on hetero core layouts I get the fun thing: > mpirun -report-bindings hostname <- Works And you get

[OMPI users] Strange affinity messages with 1.8 and torque 5

2014-09-23 Thread Brock Palen
I found a fun head scratcher: with openmpi 1.8.2 with torque 5 built with TM support, on hetero core layouts I get the fun thing:

mpirun -report-bindings hostname <- Works
mpirun -report-bindings -np 64 hostname <- Wat?
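Per Ralph's replies above, the warning itself was simply removed in 1.8.3. For 1.8.2, one sketch of a workaround (assuming the standard Open MPI 1.8.x mpirun options; verify against your install's man page before relying on it) is to disable binding explicitly so mpirun has nothing to warn about:

```shell
# Sketch only: on Open MPI 1.8.x, --bind-to none launches the ranks
# without setting processor affinity, which sidesteps the binding
# warning at the cost of losing pinning. Upgrading to 1.8.3, where the
# warning was dropped, is the fix the thread converges on.
mpirun --report-bindings --bind-to none -np 64 hostname
```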