Jesmin,
First of all, have you read the following pages?
http://www.gromacs.org/Documentation/Acceleration_and_parallelization
http://www.gromacs.org/Documentation/Cut-off_schemes
To summarize:
- OpenMP parallelization is supported in the Verlet scheme for the full mdrun,
and for separate PME nodes also with the group scheme.
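For example, to get OpenMP threads for the whole of mdrun, the Verlet scheme
has to be selected in the .mdp, after which the original command line works (a
sketch, not from the thread; the thread counts are just placeholders for your
machine):

cutoff-scheme = Verlet

mdrun -ntmpi 2 -ntomp 6 -s imd.tpr

As far as I know, though, the Verlet scheme in 4.6 does not support implicit
solvent, so this route does not help for GB runs.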
Dear Justin,
I want to compare the performance of the MPI+OpenMP implementation of
Gromacs GB energy with other MPI+X implementations of GB energy.
So, my question is:
Is there any way to get a sensible result for an implicit-solvent
simulation in Gromacs that employs both MPI and OpenMP?
If yes, then how?
On 2/27/13 11:27 AM, jesmin jahan wrote:
Thanks again Justin,
I added
cutoff-scheme = group
But still I am getting
OpenMP threads have been requested with cut-off scheme Group, but
these are only supported with cut-off scheme Verlet
which really says, OpenMP threads are supported only in Verlet!
constraints = none
integrator = md
On 2/27/13 10:54 AM, jesmin jahan wrote:
Many thanks to you Justin for your help.
Now my .mdp file looks like:
constraints = none
integrator = md
pbc = no
verlet-buffer-drift = -1
dt = 0.001
nsteps = 0
ns_type = simple
comm-mode = angular
rlist
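The preview is cut off at rlist. For what it's worth, a minimal
implicit-solvent .mdp along these lines might look like the sketch below; the
GB options and the zero (infinite) cut-offs are my assumptions to check
against the 4.6 manual, not values from the thread:

cutoff-scheme    = group    ; implicit solvent needs the group scheme in 4.6
constraints      = none
integrator       = md
dt               = 0.001
nsteps           = 0
pbc              = no
comm-mode        = angular
ns_type          = simple
nstlist          = 0        ; with pbc = no, a static neighbor list
rlist            = 0        ; 0 = infinite cut-off
rcoulomb         = 0
rvdw             = 0
implicit-solvent = GBSA
gb-algorithm     = OBC      ; Still and HCT are the alternatives
rgbradii         = 0        ; assumed: has to match rlist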
On 2/27/13 10:25 AM, jesmin jahan wrote:
Thanks Justin.
Well, I am feeling stupid now! Could you kindly provide me with some better
.mdp parameters that make more sense?
Gromacs 4.6 gives me an error with my previous parameters, which were
working perfectly in Gromacs 4.5.3.
It says that with cut-off scheme Verlet we either have to use temperature
coupling or set verlet-buffer-drift = -1.
On 2/27/13 10:01 AM, jesmin jahan wrote:
Hi Justin,
Thanks for your reply.
I am able to run it now. I am using the following parameters.
--
constraints = none
integrator = md
cutoff-scheme = Verlet
pbc
On 2/27/13 8:53 AM, jesmin jahan wrote:
Thanks Carsten for your reply.
Even after removing the comment, I am getting the same error message.
Do you have any sample .mdp file that can be used while using MPI+OpenMP?
I am trying to compute the GB energy (implicit solvent based).
Thanks,
Jesmin

You need to re-create the .tpr file after modifying the .mdp file. If you're
getting the same error, you're using the wrong .tpr file.
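For reference, the step being described is regenerating the run input with
grompp after every .mdp edit, then running from the new .tpr, along these
lines (the file names are placeholders):

grompp -f imd.mdp -c conf.gro -p topol.top -o imd.tpr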
Hi,
On Feb 27, 2013, at 6:55 AM, jesmin jahan wrote:
> Dear Gromacs Users,
> I am trying to run the following command on gromacs 4.6
> mdrun -ntmpi 2 -ntomp 6 -s imd.tpr
> But I am getting the following error
> OpenMP threads have been requested with cut-off scheme Group, but
> these are only supported with cut-off scheme Verlet
> Does anyone know a solution?
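A workaround consistent with the replies above: since GB needs the group
scheme, and the group scheme only supports OpenMP on separate PME nodes, drop
-ntomp and use thread-MPI ranks alone (12 here is a placeholder for your core
count):

mdrun -ntmpi 12 -s imd.tpr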