Mar 20 11:45:27 r1n101 slurmstepd[53842]: job 1618 completed with slurm_rc = 0, job_rc = 256
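For what it's worth, a `job_rc` of 256 looks like a raw wait(2)-style status, where the process exit code lives in the high byte and any terminating signal in the low bits. A small sketch of how one might decode it (the variable names are illustrative, not from slurmstepd):

```shell
#!/bin/sh
# Decode a wait(2)-style status word as it appears in the log above.
job_rc=256

exit_code=$(( job_rc >> 8 ))    # high byte: the process's exit code
signal=$(( job_rc & 0x7f ))     # low bits: terminating signal, if any

echo "exit code: $exit_code, signal: $signal"   # prints: exit code: 1, signal: 0
```

So slurm_rc = 0 with job_rc = 256 would mean Slurm itself saw no error, but the step exited with code 1.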
Thanks.
--
Andrés Marín Díaz
Servicio de Infraestructura e Innovación
Universidad Politécnica de Madrid
Centro de Supercomputación y Visualización de Madrid (CeSViMa)
Campus de Monteganc
srun and mpirun. But if it is launched on nodes with Slurm 19.05, it works with srun but fails with mpirun.
Can it be a bug in the new version?
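To make the comparison concrete, a minimal batch script along these lines should reproduce it (the `./hello` MPI binary and the resource numbers are placeholders, not from the original report):

```shell
#!/bin/bash
#SBATCH --nodes=2
#SBATCH --ntasks-per-node=1

# Launch the same MPI binary both ways on the same allocation.
srun ./hello     # works on the 19.05 nodes
mpirun ./hello   # reportedly fails on the 19.05 nodes
```

If srun succeeds and mpirun fails on the same allocation, that points at how mpirun bootstraps its processes under Slurm rather than at the MPI program itself.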
Thank you.
--
Andrés Marín Díaz
r1n2 slurmstepd[84954]: _oom_event_monitor: oom-kill event count: 1
2019-06-06T09:51:56.638334+00:00 r1n2 slurmstepd[84954]: done with job
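Since the step logged an oom-kill event, it may be worth checking the job's peak memory against its request before suspecting anything else. A sketch of how one might do that (the job ID is a placeholder; field names are standard sacct format fields):

```shell
# Compare peak memory (MaxRSS) against the request (ReqMem) for the job.
sacct -j <jobid> --format=JobID,State,ExitCode,MaxRSS,ReqMem

# If MaxRSS is at or above ReqMem, raise the request in the batch script,
# e.g. (value illustrative):
#SBATCH --mem-per-cpu=4G
```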
Thank you very much again.
--
Andrés Marín Díaz
On Thu, 6 Jun 2019 at 20:11, Andrés Marín Díaz <ama...@cesvima.upm.es> wrote:
Thank you very much for the help; here is some updated information.
- If we use Intel MPI (IMPI) mpirun it works correctly.
- If we use mpirun without using the sch
Ix?
Sean
--
Sean Crosby
Senior DevOpsHPC Engineer and HPC Team Lead | Research Platform Services
Research Computing | CoEPP | School of Physics
University of Melbourne
On Thu, 6 Jun 2019 at 21:11, Andrés Marín Díaz <ama...@cesvima.upm.es> wrote:
Hello,
Yes, we have reco
enMPI?
Thank you very much again.
--
Andrés Marín Díaz
Hello, I have already applied the patch and recompiled and everything
works correctly.
Now we just wait for 19.05.1.
Thank you.
--
Andrés Marín Díaz