Amit Choubey wrote:
Hi Mark,

I am not using PME calculation.

I was hoping mdrun will do the cell allocation itself.


It will, unless it can't, which is exactly your problem. Mark's point stands regardless of whether you're using PME. DD imposes certain minimum cell-size requirements (discussed in the manual and the GROMACS 4 paper), so you have two choices:

1. Read about the options mdrun is telling you about.
2. Use fewer nodes so that the DD algorithm can construct reasonably-sized 
domains.
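As a purely hypothetical illustration (the box dimensions below are invented; your real ones are in the log file), the constraint behind that error message is just length-per-cell arithmetic:

```python
# Hypothetical sketch, not GROMACS code: check whether a candidate DD
# grid keeps every cell above the minimum cell size mdrun reported.
MIN_CELL = 0.889862  # nm, taken from the error message in this thread

def cell_ok(box, grid, min_cell=MIN_CELL):
    """True if box[i]/grid[i] >= min_cell in every decomposed dimension."""
    return all(length / n >= min_cell for length, n in zip(box, grid))

box = (6.0, 6.0, 10.0)           # nm, invented for illustration
print(cell_ok(box, (8, 8, 1)))   # 6.0/8 = 0.75 nm, too small
print(cell_ok(box, (6, 6, 1)))   # 6.0/6 = 1.0 nm, fine
```

Fewer domains per box dimension means larger cells, which is why "use fewer nodes" fixes it.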

-Justin

Thanks,
Amit


On Fri, Feb 19, 2010 at 2:27 PM, Mark Abraham <[email protected]> wrote:

     ----- Original Message -----
     From: Amit Choubey <[email protected]>
     Date: Saturday, February 20, 2010 8:51
     Subject: [gmx-users] domain decomposition and load balancing
     To: Discussion list for GROMACS users <[email protected]>

     > Hi Everyone,
     >
     > I am trying to run a simulation with the option "pbc=xy" turned
     > on. I am using 64 processors for the simulation. mdrun_mpi
     > produces the following error message before starting the MD steps:
     >
     >   There is no domain decomposition for 64 nodes that is compatible
     >   with the given box and a minimum cell size of 0.889862 nm
     >   Change the number of nodes or mdrun option -rdd or -dds
     >   Look in the log file for details on the domain decomposition
     >
     > This has to do with the load balancing in the domain decomposition
     > version of mdrun. Can anyone suggest how to set the -rdd or -dds
     > options?

    Those options are not normally the problem - but see the log file
    for info and mdrun -h for instructions.

     You should read up on domain decomposition in the manual, and
     choose npme such that 64-npme is sufficiently composite that mdrun
     can build a reasonably compact 3D grid, so that the minimum cell
     size is not a constraint. Cells have to be large enough that all
     nonbonded interactions can be resolved in consultation with at
     most nearest-neighbour cells (plus some other constraints).

     I'm assuming pbc=xy requires a 2D DD. For example, npme=19 gives
     npp=45, i.e. a 9x5x1 grid, but npme=28 gives npp=36, i.e. 6x6x1,
     which allows the cells to have the smallest diameter possible. Of
     course, if your simulation box is so small that a 2D DD for pbc=xy
     will always lead to slabs that are too thin in one dimension, then
     you can't solve this problem with DD.

     If pbc=xy permits a 3D DD, then the same considerations apply:
     npme=19 gives 5x3x3, but npme=28 allows 4x3x3.
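The factor-counting in the two paragraphs above can be sketched as follows (illustrative Python only, not GROMACS code; mdrun's actual grid selection weighs more than compactness):

```python
# Sketch: enumerate ways to arrange npp = 64 - npme particle-particle
# ranks into a DD grid, and pick the most compact (most cube-like) one.

def grids(n, ndim=3):
    """All ordered factorizations of n into ndim positive factors."""
    if ndim == 1:
        return [(n,)]
    out = []
    for d in range(1, n + 1):
        if n % d == 0:
            out.extend((d,) + rest for rest in grids(n // d, ndim - 1))
    return out

def most_compact(n, ndim=3):
    """Grid whose max/min factor ratio is smallest, i.e. most cubic."""
    return min(grids(n, ndim), key=lambda g: (max(g) / min(g), max(g)))

for npme in (19, 28):
    npp = 64 - npme
    print(npme, npp, most_compact(npp))     # 3D DD
    print(npme, npp, most_compact(npp, 2))  # 2D DD for pbc=xy
```

npp=36 factors into the near-cubic 3x3x4 (or 6x6 in 2D), while npp=45 can do no better than 3x3x5 (5x9 in 2D), which is why npme=28 gives larger, rounder cells here.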

     > Also, the simulation runs fine on one node (with domain
     > decomposition) and with particle decomposition, but both are
     > extremely slow.

    Well, that's normal...

    Mark
    --
    gmx-users mailing list    [email protected]
    http://lists.gromacs.org/mailman/listinfo/gmx-users
    Please search the archive at http://www.gromacs.org/search before
    posting!
    Please don't post (un)subscribe requests to the list. Use the
    www interface or send it to [email protected].
    Can't post? Read http://www.gromacs.org/mailing_lists/users.php



--
========================================

Justin A. Lemkul
Ph.D. Candidate
ICTAS Doctoral Scholar
MILES-IGERT Trainee
Department of Biochemistry
Virginia Tech
Blacksburg, VA
jalemkul[at]vt.edu | (540) 231-9080
http://www.bevanlab.biochem.vt.edu/Pages/Personal/justin

========================================
