The energy minimization ran without any problem on 4 processors; the problem occurs only when I perform the MD run. Also, I did not get any error message related to LINCS or anything similar.

JJ
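(A note on the -pd option discussed below: in GROMACS 4.0.x, mdrun lists it as -[no]pd, a boolean flag, so it takes no argument and passing "yes" after it should not be necessary. A minimal invocation with particle decomposition, assuming the same input files used in this thread, would be:

  mpirun -np 4 mdrun_mpi -s pr -e pr -g md -o traj.trr -c pr.gro -pd &

with -nopd as the explicit way to switch it off.)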
On Wed, Jun 24, 2009 at 6:53 PM, Justin A. Lemkul <jalem...@vt.edu> wrote:
>
> jayant james wrote:
>>
>> Yes, my distance restraints are long because I am using FRET distances
>> as distance restraints while performing MD simulations. Upon using this
>> command:
>>
>>   mpirun -np 4 mdrun_mpi -s pr -e pr -g md -o traj.trr -c pr.gro -pd &
>>
>> I get the following error. I did try giving "yes" after -pd, but even
>> then the same error message is repeated.
>
> Anything else printed to the screen or log file? LINCS warnings or
> anything else? Did energy minimization complete successfully?
>
> -Justin
>
>> Back Off! I just backed up md.log to ./#md.log.9#
>> Reading file pr.tpr, VERSION 4.0 (single precision)
>> NNODES=4, MYRANK=1, HOSTNAME=localhost.localdomain
>> NODEID=1 argc=13
>> NODEID=3 argc=13
>> [localhost:17514] *** Process received signal ***
>> [localhost:17514] Signal: Segmentation fault (11)
>> [localhost:17514] Signal code: Address not mapped (1)
>> [localhost:17514] Failing at address: 0x1340000
>> [localhost:17514] [ 0] /lib64/libpthread.so.0 [0x38dec0f0f0]
>> [localhost:17514] [ 1] /lib64/libc.so.6(memcpy+0x15b) [0x38de08432b]
>> [localhost:17514] [ 2] /usr/lib64/openmpi/1.2.4-gcc/libmpi.so.0(ompi_convertor_pack+0x152) [0x3886e45392]
>> [localhost:17514] [ 3] /usr/lib64/openmpi/1.2.4-gcc/openmpi/mca_btl_sm.so(mca_btl_sm_prepare_src+0x13d) [0x7f39dd118a4d]
>> [localhost:17514] [ 4] /usr/lib64/openmpi/1.2.4-gcc/openmpi/mca_pml_ob1.so(mca_pml_ob1_send_request_start_rndv+0x140) [0x7f39dd735230]
>> [localhost:17514] [ 5] /usr/lib64/openmpi/1.2.4-gcc/openmpi/mca_pml_ob1.so(mca_pml_ob1_send+0x748) [0x7f39dd72e508]
>> [localhost:17514] [ 6] /usr/lib64/openmpi/1.2.4-gcc/openmpi/mca_coll_tuned.so(ompi_coll_tuned_bcast_intra_split_bintree+0x91c) [0x7f39dc6f735c]
>> [localhost:17514] [ 7] /usr/lib64/openmpi/1.2.4-gcc/libmpi.so.0(MPI_Bcast+0x15c) [0x3886e4c40c]
>> [localhost:17514] [ 8] mdrun_mpi(bcast_state+0x26c) [0x56d59c]
>> [localhost:17514] [ 9] mdrun_mpi(mdrunner+0x1067) [0x42b807]
>> [localhost:17514] [10] mdrun_mpi(main+0x3b4) [0x431c34]
>> [localhost:17514] [11] /lib64/libc.so.6(__libc_start_main+0xe6) [0x38de01e576]
>> [localhost:17514] [12] mdrun_mpi [0x413339]
>> [localhost:17514] *** End of error message ***
>> mpirun noticed that job rank 0 with PID 17514 on node
>> localhost.localdomain exited on signal 11 (Segmentation fault).
>> 3 additional processes aborted (not shown)
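(Reading the backtrace: the segmentation fault is raised inside MPI_Bcast, called from mdrun's bcast_state(), i.e., while rank 0 is broadcasting the starting state to the other ranks at startup, before any MD steps have run. One way to separate an Open MPI problem from a GROMACS problem, assuming the same inputs, would be to try the particle-decomposition run on a single rank first:

  mpirun -np 1 mdrun_mpi -s pr -e pr -g md -o traj.trr -c pr.gro -pd

If that starts cleanly, the crash is specific to the parallel startup path.)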
>> On Wed, Jun 24, 2009 at 6:04 PM, Justin A. Lemkul <jalem...@vt.edu> wrote:
>>>
>>> jayant james wrote:
>>>>
>>>> I just replaced the old gmx 4.0 version with the 4.0.5 version and
>>>> still have the same problem:
>>>>
>>>>   NOTE: atoms involved in distance restraints should be within the
>>>>   longest cut-off distance, if this is not the case mdrun generates a
>>>>   fatal error, in that case use particle decomposition (mdrun option -pd)
>>>
>>> Well, does it work with -pd? It looks like your distance restraints are
>>> indeed quite long, so this looks like it is your only option.
>>>
>>> -Justin
>>>
>>>>   WARNING: Can not write distance restraint data to energy file with
>>>>   domain decomposition
>>>>
>>>> On Wed, Jun 24, 2009 at 5:10 PM, Justin A. Lemkul <jalem...@vt.edu> wrote:
>>>>>
>>>>> jayant james wrote:
>>>>>>
>>>>>> Hi!
>>>>>> I am performing an MPI MD run (on a quad-core system) with distance
>>>>>> restraints. When I execute the command below without the distance
>>>>>> restraints, the MD run is distributed over 4 nodes perfectly well,
>>>>>> but when I incorporate the distance restraints I hit a roadblock:
>>>>>>
>>>>>>   mpirun -np 4 mdrun_mpi -s pr -e pr -g md -o traj.trr -c pr.gro &
>>>>>>
>>>>>> I get the error message below. My pr.mdp and distance restraints
>>>>>> files are given after the error message.
>>>>>>
>>>>>> Question: How do I handle this situation? Do I increase the
>>>>>> long-range cut-off in the pr.mdp file? If you look at my distance
>>>>>> restraints file, my upper range of distances is close to 9 nm!
>>>>>
>>>>> Upgrade to the latest version (4.0.5), since there have been numerous
>>>>> improvements to domain decomposition throughout the development of
>>>>> version 4.0.
>>>>>
>>>>> -Justin
>>>>>
>>>>>> Please guide.
>>>>>> Thanks
>>>>>> JJ
>>>>>>
>>>>>> ------------------------------------------------------------------
>>>>>> Back Off! I just backed up md.log to ./#md.log.6#
>>>>>> Reading file pr.tpr, VERSION 4.0 (single precision)
>>>>>>
>>>>>> NOTE: atoms involved in distance restraints should be within the
>>>>>> longest cut-off distance, if this is not the case mdrun generates a
>>>>>> fatal error, in that case use particle decomposition (mdrun option -pd)
>>>>>>
>>>>>> WARNING: Can not write distance restraint data to energy file with
>>>>>> domain decomposition
>>>>>>
>>>>>> -------------------------------------------------------
>>>>>> Program mdrun_mpi, VERSION 4.0.2
>>>>>> Source code file: domdec.c, line: 5842
>>>>>>
>>>>>> Fatal error:
>>>>>> There is no domain decomposition for 4 nodes that is compatible with
>>>>>> the given box and a minimum cell size of 9.85926 nm
>>>>>> Change the number of nodes or mdrun option -rdd or -dds
>>>>>> Look in the log file for details on the domain decomposition
>>>>>> ------------------------------------------------------------------
>>>>>>
>>>>>> *pr.mdp*
>>>>>>
>>>>>> ; User spoel (236)
>>>>>> ; Wed Nov  3 17:12:44 1993
>>>>>> ; Input file
>>>>>> title                 = Yo
>>>>>> cpp                   = /usr/bin/cpp
>>>>>> define                = -DDISRES
>>>>>> constraints           = none
>>>>>> ;constraint_algorithm = lincs
>>>>>> ;lincs_order          = 4
>>>>>> integrator            = md
>>>>>> dt                    = 0.001    ; ps
>>>>>> nsteps                = 4000000  ; total 4.0 ns
>>>>>> nstcomm               = 1
>>>>>> nstxout               = 50000
>>>>>> nstvout               = 50000
>>>>>> nstfout               = 50000
>>>>>> nstlog                = 50000
>>>>>> nstenergy             = 500
>>>>>> nstlist               = 10
>>>>>> ns_type               = grid
>>>>>> rlist                 = 1.0
>>>>>> coulombtype           = PME
>>>>>> rcoulomb              = 1.0
>>>>>> vdwtype               = cut-off
>>>>>> rvdw                  = 1.4
>>>>>> fourierspacing        = 0.12
>>>>>> fourier_nx            = 0
>>>>>> fourier_ny            = 0
>>>>>> fourier_nz            = 0
>>>>>> pme_order             = 4
>>>>>> ewald_rtol            = 1e-5
>>>>>> optimize_fft          = yes
>>>>>> disre                 = simple
>>>>>> disre_weighting       = equal
>>>>>> ; V-rescale temperature coupling is on in two groups
>>>>>> Tcoupl                = V-rescale
>>>>>> tc-grps               = Protein Non-Protein
>>>>>> tau_t                 = 0.1 0.1
>>>>>> ref_t                 = 300 300
>>>>>> ; Energy monitoring
>>>>>> energygrps            = Protein Non-Protein
>>>>>> ;tnc Non-Protein tnt NMR tni
>>>>>> ; Parrinello-Rahman pressure coupling is on
>>>>>> Pcoupl                = parrinello-rahman
>>>>>> tau_p                 = 0.5
>>>>>> compressibility       = 4.5e-5
>>>>>> ref_p                 = 1.0
>>>>>> ; Simulated annealing (currently disabled)
>>>>>> ; Type of annealing for each temperature group (no/single/periodic)
>>>>>> ;annealing            = no, no, no, single, no
>>>>>> ; Number of annealing points to use for each group
>>>>>> ;annealing_npoints    = 0, 0, 0, 9, 0
>>>>>> ; List of times at the annealing points for each group
>>>>>> ;annealing_time       = 0 25 50 75 100 125 150 175 200
>>>>>> ; Temp. at each annealing point, for each group
>>>>>> ;annealing_temp       = 300 350 400 450 500 450 400 350 300
>>>>>>
>>>>>> *distance restraints file*
>>>>>>
>>>>>> [ distance_restraints ]
>>>>>> ;  ai    aj  type index type'  low    up1    up2   fac
>>>>>> ; TnT240-TnI 131, 145, 151, 160, 167 (ca+-7)
>>>>>>   2019  3889   1    1     1    3.91   3.91   5.31  0.574679
>>>>>>   2019  4056   1    2     1    4.86   4.86   6.26  0.409911
>>>>>>   2019  4133   1    3     1    5.69   5.69   7.09  0.457947
>>>>>>   2019  4207   1    4     1    6.63   6.63   8.03  0.323852
>>>>>>   2019  4273   1    5     1    7.14   7.14   8.54  0.294559
>>>>>> ; TnT276-TnI 131, 145, 151, 160, 167, 5, 17, 27, 40
>>>>>>   2434  3889   1    6     1    1.34   1.34   2.74  4.884769
>>>>>>   2434  4056   1    7     1    2.13   2.13   3.53  0.523368
>>>>>>   2434  4133   1    8     1    3.66   3.66   5.06  0.409911
>>>>>>   2434  4207   1    9     1    4.48   4.48   5.88  0.547825
>>>>>>   2434  4273   1   10     1    5.43   5.43   6.83  0.285938
>>>>>>   2434  2628   1   11     1    5.89   5.89   7.29  0.241333
>>>>>>   2434  2719   1   12     1    4.76   4.76   6.16  0.366358
>>>>>>   2434  2824   1   13     1    3.81   3.81   5.21  0.644145
>>>>>>   2434  2972   1   14     1    3.10   3.10   4.50  0.431009
>>>>>> ; TnT288-TnI 131, 145, 151, 160, 167, 5, 17, 27, 40
>>>>>>   2557  3889   1   15     1    1.89   1.89   3.29  1.429688
>>>>>>   2557  4056   1   16     1    3.25   3.25   4.65  0.32931
>>>>>>   2557  4133   1   17     1    4.44   4.44   5.84  0.346847
>>>>>>   2557  4207   1   18     1    4.80   4.80   6.20  0.275198
>>>>>>   2557  4273   1   19     1    5.84   5.84   7.24  0.200744
>>>>>>   2557  2628   1   20     1    4.79   4.79   6.19  1.046736
>>>>>>   2557  2719   1   21     1    5.06   5.06   6.46  0.267659
>>>>>> ; 2557  2824   1   22     1
>>>>>>   2557  2972   1   23     1    3.99   3.99   5.39  0.412797
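(Two notes on the numbers above, offered as a sanity check rather than a diagnosis. First, in the GROMACS [ distance_restraints ] format the last column is the force-constant factor kfac, which scales the global disre_fc per restraint, so the "fac" values here weight the individual FRET-derived restraints. Second, the minimum cell size of 9.85926 nm in the fatal error is consistent with the longest restraint in this table, up2 = 8.54 nm, plus a communication margin: domain decomposition has to keep each restrained atom pair within reach of neighboring cells. For 4 ranks that would require, roughly,

  4 x 1 x 1 grid: one box edge  >= 4 x 9.85926 nm ~ 39.4 nm
  2 x 2 x 1 grid: two box edges >= 2 x 9.85926 nm ~ 19.7 nm

which is far larger than a typical solvated-protein box, and is why mdrun points to particle decomposition (-pd) instead.)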
> --
> ========================================
>
> Justin A. Lemkul
> Ph.D. Candidate
> ICTAS Doctoral Scholar
> Department of Biochemistry
> Virginia Tech
> Blacksburg, VA
> jalemkul[at]vt.edu | (540) 231-9080
> http://www.bevanlab.biochem.vt.edu/Pages/Personal/justin
>
> ========================================

--
Jayasundar Jayant James
www.chick.com/reading/tracts/0096/0096_01.asp
_______________________________________________
gmx-users mailing list    gmx-users@gromacs.org
http://lists.gromacs.org/mailman/listinfo/gmx-users
Please search the archive at http://www.gromacs.org/search before posting!
Please don't post (un)subscribe requests to the list. Use the www interface
or send it to gmx-users-requ...@gromacs.org.
Can't post? Read http://www.gromacs.org/mailing_lists/users.php