Fabio Affinito wrote:
This is the mdrun output:

Initializing Domain Decomposition on 4096 nodes
Dynamic load balancing: auto
Will sort the charge groups at every domain (re)decomposition
Initial maximum inter charge-group distances:
    two-body bonded interactions: 30.031 nm, LJ-14, atoms 40702 40705
  multi-body bonded interactions: 30.031 nm, Angle, atoms 40701 40704
Minimum cell size due to bonded interactions: 33.035 nm
Maximum distance for 7 constraints, at 120 deg. angles, all-trans: 1.139 nm
Estimated maximum distance required for P-LINCS: 1.139 nm
Guess for relative PME load: 0.43
Will use 2400 particle-particle and 1696 PME only nodes
This is a guess, check the performance at the end of the log file
Using 1696 separate PME nodes
Scaling the initial minimum size with 1/0.8 (option -dds) = 1.25
Optimizing the DD grid for 2400 cells with a minimum initial size of 41.293 nm
The maximum allowed number of cells is: X 0 Y 0 Z 0

-------------------------------------------------------
Program mdrun_mpi_bg, VERSION 4.5.4
Source code file: domdec.c, line: 6438

Fatal error:
There is no domain decomposition for 2400 nodes that is compatible with the given box and a minimum cell size of 41.2932 nm
Change the number of nodes or mdrun option -rdd or -dds
Look in the log file for details on the domain decomposition
For more information and tips for troubleshooting, please check the GROMACS
website at http://www.gromacs.org/Documentation/Errors
-------------------------------------------------------


This is bizarre. Those 30 nm two-body and multi-body bonded distances mean bonded atoms are being found on opposite sides of the box, i.e. molecules are still broken across the periodic boundary; that is what drives the minimum cell size up to 41.3 nm (33.035 nm scaled by 1/0.8) and makes any domain decomposition impossible. Did you fix the periodicity of the original coordinate file (conf_start.gro) and rebuild the system, or did you try to run trjconv on the replicated system? The former is the correct approach.
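
For reference, a minimal sketch of making the molecules whole in the original coordinates before replicating (output name is a placeholder, and topol.tpr is assumed to be the matching run input file):

  # rewrap the original coordinates so each molecule is whole across the periodic boundary
  trjconv -s topol.tpr -f conf_start.gro -o conf_start_whole.gro -pbc whole

Then rebuild the replicated system starting from the whole-molecule coordinates, rather than trying to repair the replicated system afterwards.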

-Justin

--
========================================

Justin A. Lemkul
Ph.D. Candidate
ICTAS Doctoral Scholar
MILES-IGERT Trainee
Department of Biochemistry
Virginia Tech
Blacksburg, VA
jalemkul[at]vt.edu | (540) 231-9080
http://www.bevanlab.biochem.vt.edu/Pages/Personal/justin

========================================