Using ICC 13.0, I got the same result.
Input files: conf.gro and mdp files
http://www.gromacs.org/Documentation/Installation_Instructions_4.5/GROMACS-OpenMM
cpu-imp-RF-inf.mdp
constraints = all-bonds
integrator = md
dt = 0.002 ; ps
nsteps = 0
nstlist = 0
ns_type
I'm trying to run an MD or EM calculation with an implicit solvation method in
GROMACS 4.6, but I always get incorrect results.
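For comparison, the implicit-solvent section of a 4.6 mdp usually also needs the
GB/SA options below. This is only a sketch to show which parameters are involved;
the values are assumptions, not the actual settings of cpu-imp-RF-inf.mdp:

  implicit_solvent    = GBSA
  gb_algorithm        = OBC               ; Still, HCT or OBC
  nstgbradii          = 1
  rgbradii            = 1.0               ; must equal rlist
  gb_epsilon_solvent  = 80
  sa_algorithm        = Ace-approximation
  sa_surface_tension  = 2.05016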
ICC version: icc 11.0
FFTW version: 3.2.2
The benchmark system is gromacs-gpubench
(gromacs-gpubench-dhfr.tar/CPU/dhfr-impl-inf.bench).
Angle, Proper Dih, Imp Dih, Nonpolar Sol,
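If it helps to reproduce this, the benchmark can be run directly with mdrun,
assuming the .bench file is an ordinary run-input (.tpr) file as shipped in the
GPU benchmark package; for example:

  cp CPU/dhfr-impl-inf.bench dhfr-impl-inf.tpr
  mdrun -s dhfr-impl-inf.tpr -deffnm dhfr-impl-inf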
I tried OpenMPI v1.3.3, but I got the same error.
mdrun_mpi -multi works fine. REMD has a problem.
## error message ##
step 500, will finish Fri Aug 13 16:43:25 2010
[localhost:20171] *** Process received signal ***
[localhost:20172] *** Process received signal ***
[localhost:20172] Signal: Seg
Hello!
I'm doing a simple REMD test with 4 replicas.
Time step: 2 fs
Exchange: every 500 fs
md_0.tpr md_1.tpr md_2.tpr md_3.tpr
mpiexec (or mpirun) -np 4 mdrun_mpi_d -deffnm md_ -multi 4 -replex 200
I got an error message.
## error ##
100 steps, 2000.0 ps.
step 600 rank 3 in job 10 localho
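For reference, in a -multi setup like this the four run inputs are normally
generated one per replica, each from its own mdp with that replica's ref_t.
A minimal sketch (topol.top and the md_N.mdp names are assumptions, not the
poster's files; use grompp_d for a double-precision build):

  for i in 0 1 2 3; do
    grompp -f md_${i}.mdp -c conf.gro -p topol.top -o md_${i}.tpr
  done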
Hello everyone, I'm trying to perform an REMD simulation in the NPT ensemble.
I have a couple of questions.
1) Using Parrinello-Rahman pressure coupling, the REMD simulation is unstable at
high temperature (highest temperature ~400 K).
2) Using the Berendsen T-coupling and P-coupling scheme (tau_t = 0.1 ps
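For context, the two setups being compared correspond roughly to the following
mdp fragments (a sketch only; tau_p, ref_p and compressibility values are
assumptions, not the poster's actual settings):

  ; setup 1: Parrinello-Rahman barostat
  tcoupl           = berendsen
  tau_t            = 0.1           ; ref_t differs per replica
  pcoupl           = parrinello-rahman
  tau_p            = 2.0
  ref_p            = 1.0
  compressibility  = 4.5e-5

  ; setup 2: same, but with the Berendsen barostat
  pcoupl           = berendsen
  tau_p            = 1.0

Parrinello-Rahman is generally less forgiving than Berendsen when the box is far
from its equilibrium volume, which may matter for the hottest replicas.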