Thank you for your response.
I have attached the results of running valgrind --tool=memcheck --leak-check=yes -v with the following command:

valgrind --tool=memcheck --leak-check=yes -v mpirun -np 8 -mca btl vader,self,tcp -mca btl_tcp_eager_limit 4095 -x LD_LIBRARY_PATH ./chimere.e
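If it would be more useful to have memcheck attached to the chimere.e processes themselves rather than to mpirun, I can rerun it with valgrind placed inside the mpirun command line instead, for example (the --log-file name here is only an illustration, to keep one log per rank):

mpirun -np 8 -mca btl vader,self,tcp -mca btl_tcp_eager_limit 4095 -x LD_LIBRARY_PATH \
    valgrind --tool=memcheck --leak-check=yes -v --log-file=valgrind-%p.log ./chimere.e

(%p expands to the PID of each process, so every rank writes its own memcheck log.)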

On Friday, September 28, 2018 at 7:03 PM, Ralph H Castain <r...@open-mpi.org> wrote:

> Ummm…looks like you have a problem in your input deck to that application.
> Not sure what we can say about it…
>
>
> > On Sep 28, 2018, at 9:47 AM, Zeinab Salah <zeinabsa...@gmail.com> wrote:
> >
> > Hi everyone,
> > I use Open MPI 3.0.2 and want to run the CHIMERE model with 8 processes,
> > but in the parallel-mode step the run stopped with the following error
> > message.
> > Could you please help me?
> > Thank you in advance
> > Zeinab
> >
> >      +++ CHIMERE RUNNING IN PARALLEL MODE +++
> >               MPI SUB-DOMAINS :
> > rank  izstart  izend  nzcount  imstart imend  nmcount     i       j
> > --------------------------------------------------------------------
> >    1       1      14      14       1      22      22       1       1
> >    2      15      27      13       1      22      22       2       1
> >    3      28      40      13       1      22      22       3       1
> >    4      41      53      13       1      22      22       4       1
> >    5       1      14      14      23      43      21       1       2
> >    6      15      27      13      23      43      21       2       2
> >    7      28      40      13      23      43      21       3       2
> >    8      41      53      13      23      43      21       4       2
> >  Sub domain dimensions:           14          22
> >
> >  boundary conditions:
> /home/dream/CHIMERE/chimere2017r4/../BIGFILES/OUTPUTS/Test/../INIBOUN.10/BOUN_CONCS.2009030700_2009030900_Test.list
> >            3  boundary conditions file(s) found
> >  Opening
> /home/dream/CHIMERE/chimere2017r4/../BIGFILES/OUTPUTS/Test/../INIBOUN.10/BOUN_CONCS.2009030700_2009030900_Test.nc-gas
> >  Opening
> /home/dream/CHIMERE/chimere2017r4/../BIGFILES/OUTPUTS/Test/../INIBOUN.10/BOUN_CONCS.2009030700_2009030900_Test.nc-aer
> >  Opening
> /home/dream/CHIMERE/chimere2017r4/../BIGFILES/OUTPUTS/Test/../INIBOUN.10/BOUN_CONCS.2009030700_2009030900_Test.nc-dust
> >  Opening
> /home/dream/CHIMERE/chimere2017r4/../BIGFILES/OUTPUTS/Test/../INIBOUN.10/BOUN_CONCS.2009030700_2009030900_Test.nc-gas
> >  Opening
> /home/dream/CHIMERE/chimere2017r4/../BIGFILES/OUTPUTS/Test/../INIBOUN.10/BOUN_CONCS.2009030700_2009030900_Test.nc-aer
> >  Opening
> /home/dream/CHIMERE/chimere2017r4/../BIGFILES/OUTPUTS/Test/../INIBOUN.10/BOUN_CONCS.2009030700_2009030900_Test.nc-dust
> > -------------------------------------------------------
> > Primary job  terminated normally, but 1 process returned
> > a non-zero exit code. Per user-direction, the job has been aborted.
> > -------------------------------------------------------
> >
> --------------------------------------------------------------------------
> > mpirun noticed that process rank 5 with PID 0 on node localhost exited
> on signal 9 (Killed).
> >
> --------------------------------------------------------------------------
> >
> > real  3m51.733s
> > user  0m5.044s
> > sys   1m8.617s
> > Abnormal termination of step2.sh
> >

<<attachment: log.zip>>

_______________________________________________
users mailing list
users@lists.open-mpi.org
https://lists.open-mpi.org/mailman/listinfo/users
