Thanks Mark, I played around with the -npme option a little, and it turns out that -npme 1 works for this system. This is the mdrun command I have in the script file:
mdrun_openmpi -nosum -dlb yes -npme 1 -cpt 40 -maxh 48 -deffnm md
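
In case it helps anyone searching the archive, the full launch line looks roughly like this (just a sketch; I'm assuming an OpenMPI mpirun wrapper and 16 MPI processes, as in the log below - the exact scheduler invocation will vary):

mpirun -np 16 mdrun_openmpi -nosum -dlb yes -npme 1 -cpt 40 -maxh 48 -deffnm md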

Thanks.

Quoting Mark Abraham <mark.abra...@anu.edu.au>:

----- Original Message -----
From: nishap.pa...@utoronto.ca
Date: Monday, May 17, 2010 10:28
Subject: Re: [gmx-users] Domain decomposition error
To: gmx-users@gromacs.org

Thanks Justin. But how come it worked for methanol? The system is the same size, and all the parameters are the same, so I don't understand why it won't work for ethanol.

I'd guess that the greater size of ethanol, in combination with constraints, is running afoul of the DD minimum cell-size requirements. As you will see on reading the .log file, for your ethanol P-LINCS requires at least 0.497 nm given the initial condition of your system. DD fudges that up by a factor of 1.25 to give some flexibility. Given that minimum size requirement, only 6 cells can result from a partition in any dimension of a 4nm x 4nm x 4nm box. You've probably artificially required DD to use 14 processors with -npme 2. That requires a 14x1x1 or 7x2x1 DD, neither of which can be consistent with the combination of your box size and constraint usage. Methanol will have a smaller constraint distance (see its .log file to compare), so the DD will be legal. If you'd given your full mdrun command line in your post, I wouldn't be guessing as much...
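
To spell out the arithmetic (a back-of-the-envelope check using the numbers reported in your .log file):

0.497 nm x 1.25 = 0.622 nm minimum initial cell size (the log's 0.62175 nm comes from the unrounded constraint distance)
floor(4.0 nm / 0.622 nm) = 6 cells maximum per box dimension
14 PP cells => 14x1x1 or 7x2x1, and both need at least 7 cells along one axis, which exceeds 6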

The .log file recommends a range of useful solutions for you to consider - but it makes no suggestions regarding your .mdp file. Roughly speaking, the .mdp file describes the model physics and controls the output conditions, while the (sometimes implicit) arguments of mdrun describe the implementation of the resulting algorithm. You're requiring an impossible implementation.

Simplest is not to specify -npme unless you know you need to. For efficiency, both -npme and (-np minus -npme) need to be sufficiently composite, preferably sharing a high common factor. If you leave mdrun alone, it will guess reasonably. For example, -npme 4, giving 12 DD nodes decomposed 4x3x1, will work admirably in your case, and I bet that's what mdrun picks.
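
Concretely, something like the following should work (hypothetical launch lines - I'm guessing your launcher is mpirun with the 16 processes your log reports):

mpirun -np 16 mdrun_openmpi -dlb yes -deffnm md
mpirun -np 16 mdrun_openmpi -dlb yes -npme 4 -deffnm md

The first lets mdrun choose the PP/PME split itself; the second forces 4 PME nodes, leaving 12 PP nodes for a 4x3x1 grid.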

Mark

Quoting "Justin A. Lemkul" <jalem...@vt.edu>:

>
>
>nishap.pa...@utoronto.ca wrote:
>>Hello,
>>
>>  I got the following error when I was trying to run a simulation of an ethanol-water box of size 4x4x4 nm (~6530 atoms).
>>
>>Fatal error:
>>There is no domain decomposition for 14 nodes that is compatible with the given box and a minimum cell size of 0.62175 nm
>>Change the number of nodes or mdrun option -rcon or -dds or your LINCS settings
>>Look in the log file for details on the domain decomposition.
>>
>>I looked into my log file and this is what I got:
>>
>>Initializing Domain Decomposition on 16 nodes
>>Dynamic load balancing: yes
>>Will sort the charge groups at every domain (re)decomposition
>>Initial maximum inter charge-group distances:
>>   two-body bonded interactions: 0.234 nm, LJ-14, atoms 1 9
>> multi-body bonded interactions: 0.234 nm, Angle, atoms 2 5
>>Minimum cell size due to bonded interactions: 0.257 nm
>>Maximum distance for 5 constraints, at 120 deg. angles, all-trans: 0.497 nm
>>Estimated maximum distance required for P-LINCS: 0.497 nm
>>This distance will limit the DD cell size, you can override this with -rcon
>>Guess for relative PME load: 0.13
>>Will use 14 particle-particle and 2 PME only nodes
>>This is a guess, check the performance at the end of the log file
>>Using 2 separate PME nodes
>>Scaling the initial minimum size with 1/0.8 (option -dds) = 1.25
>>Optimizing the DD grid for 14 cells with a minimum initial size of 0.622 nm
>>The maximum allowed number of cells is: X 6 Y 6 Z 6
>>
>>I am using the same .mdp file that I used for methanol, where it works fine. I don't understand why it's giving me a problem for ethanol. I am running my simulation on 2 nodes.
>>
>
>Not according to the log file messages.  You're running on 16 nodes, with 14 for PP and 2 for PME.  Your system is of insufficient size to be divided over this many PP nodes.  Check the list archive for tips, or simply use fewer nodes for the calculation.  There is also information in the manual about all of the settings that are mentioned in the log file.
>
>-Justin
>
>>Suggestions?
>>
>>Thanks
>>-Nisha P
>>
>
>--
>========================================
>
>Justin A. Lemkul
>Ph.D. Candidate
>ICTAS Doctoral Scholar
>MILES-IGERT Trainee
>Department of Biochemistry
>Virginia Tech
>Blacksburg, VA
>jalemkul[at]vt.edu | (540) 231-9080
>http://www.bevanlab.biochem.vt.edu/Pages/Personal/justin
>
>========================================