darre...@ece.ubc.ca wrote:
Dear GROMACS Gurus,
I am experiencing a segmentation fault when mdrun executes. My simulation
has a graphene lattice with an array (layer) of ammonia molecules above
it. The box is three times the width and three times the length of the
graphene lattice, and three times the height of the gap between the
lattice and the ammonia layer. I am including the .mdp file and the
error message.

Probably your system is exploding: the integration is failing because of excessive forces. To diagnose it, look at the bottom of stdout, stderr, *and* the .log file. The error message you give below is merely the diagnostic trace from the MPI library, and it is not useful for finding out what GROMACS thinks the problem is. Further advice below.
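
For example, something like the following will show what mdrun itself printed just before dying (assuming the log file is called md.log; substitute whatever name you gave it):

  tail -n 60 md.log                              # mdrun writes its own diagnostics at the end of the log
  grep -iE 'warning|error|lincs' md.log | tail   # typical symptoms of an exploding system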

***************************************************************************
.mdp file
title           =FWS
;warnings       =10
cpp             =cpp
;define         =-DPOSRES
;constraints    =all-bonds
integrator      =md
dt              =0.002 ; ps
nsteps          =100000
nstcomm         =1000
nstxout         =1000
;nstvout        =1000
nstfout         =0
nstlog          =1000
nstenergy       =1000
nstlist         =1000
ns_type         =grid
rlist           =2.0
coulombtype     =PME
rcoulomb        =2.0
vdwtype         =cut-off
rvdw            =5.0
fourierspacing  =0.12
fourier_nx      =0
fourier_ny      =0
fourier_nz      =0
pme_order       =4
ewald_rtol      =1e-5
optimize_fft    =yes

; This section was added to freeze the hydrogen atoms at the edge of the
; graphene lattice, to prevent movement of the lattice
;energygrp_excl = Edge Edge Edge Grph Grph Grph
freezegrps      = Edge Grph ; the hydrogen atoms in the graphene lattice
                            ; belong to the residue Edge

See the comments in section 7.3.24 of the manual. You need the energy group exclusions.
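
For example, something like this (a sketch only; the group names must match your index file, and with PME the exclusions only remove the short-range part of the interaction):

  energygrps      = Edge Grph
  energygrp_excl  = Edge Edge Edge Grph Grph Grph ; excludes the (Edge,Edge), (Edge,Grph) and (Grph,Grph) pairs

That is, uncomment your energygrp_excl line above and add a matching energygrps line.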

Mark

freezedim       = Y Y Y Y Y Y ; freeze both groups in all three dimensions

;Tcoupl         =berendsen
;tau_t          =0.1    0.1
;tc-grps        =protein non-protein
;ref_t          =300 300

;Pcoupl         =parrinello-rahman
;tau_p          =0.5
;compressibility =4.5e-5
;ref_p          =1.0

;gen_vel        =yes
;gen_temp       =300.0
;gen_seed       =173529
***************************************************************************

***************************************************************************
ERROR IN OUTPUT FILE
[node16:25758] *** Process received signal ***
[node16:25758] Signal: Segmentation fault (11)
[node16:25758] Signal code: Address not mapped (1)
[node16:25758] Failing at address: 0xfffffffe1233e230
[node16:25758] [ 0] /lib64/libpthread.so.0 [0x3834a0de80]
[node16:25758] [ 1] /usr/lib64/libmd_mpi.so.4(pme_calc_pidx+0xd6) [0x2ba295dd0606]
[node16:25758] [ 2] /usr/lib64/libmd_mpi.so.4(do_pme+0x808) [0x2ba295dd4058]
[node16:25758] [ 3] /usr/lib64/libmd_mpi.so.4(force+0x8de) [0x2ba295dba5be]
[node16:25758] [ 4] /usr/lib64/libmd_mpi.so.4(do_force+0x5ef) [0x2ba295ddeaff]
[node16:25758] [ 5] mdrun_mpi(do_md+0xe23) [0x411193]
[node16:25758] [ 6] mdrun_mpi(mdrunner+0xd40) [0x4142f0]
[node16:25758] [ 7] mdrun_mpi(main+0x239) [0x4146f9]
[node16:25758] [ 8] /lib64/libc.so.6(__libc_start_main+0xf4) [0x3833e1d8b4]
[node16:25758] [ 9] mdrun_mpi [0x40429a]
[node16:25758] *** End of error message ***
mpirun noticed that job rank 7 with PID 25758 on node node16 exited on signal 11 (Segmentation fault).
7 processes killed (possibly by Open MPI)
***************************************************************************

Could you please let me know what you think may be causing the fault?

Many thanks in advance.

Darrell Koskinen
