Dear gmx-users,
 
I have a very basic question that I hope someone can help me with. I was 
running a couple of simulations over the weekend on a shared cluster, and 
both came to a stop for the same reason:
 
Program mdrun_mpi, VERSION 4.0.7
Source code file: trnio.c, line: 252
File input/output error:
Cannot write trajectory frame; maybe you are out of quota?
 
Indeed there were quite a number of large files in my user directory (the 
.trr files, etc.). I think the problem arises from the fact that (i) I am 
storing my trajectories as full-precision .trr files and (ii) I am setting 
too small a value for nstxout and nstvout in the .mdp file. With nstxout = 
100 over 500,000 steps (see the snippet below), each run writes 5,000 
full-precision frames of both coordinates and velocities.
 
I have seen tutorials that suggest using trjconv to derive reduced-precision 
.xtc files from the .trr files and then discard the latter; I have put the 
kind of command I mean below. My question is: would this be wise? I am not 
sure whether I will need these .trr files again in the future.
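
This is roughly the invocation I have in mind (the file names traj.trr and 
traj.xtc are just placeholders for my own files; I would run it once per 
trajectory and only delete the .trr after checking the .xtc):

trjconv -f traj.trr -o traj.xtc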
 
Here is a snippet of my .mdp file. Am I saving my coordinates and velocities 
too frequently? If I were to increase these output intervals, are there any 
compelling factors that I need to take into consideration? After the snippet 
I have also sketched the kind of settings I was thinking of switching to.
; Run parameters
integrator      = md            ; leap-frog integrator
nsteps          = 500000        ; 2 * 500000 = 1000 ps (1 ns)
dt              = 0.002         ; 2 fs
; Output control
nstxout         = 100           ; save coordinates every 0.2 ps
nstvout         = 100           ; save velocities every 0.2 ps
nstenergy       = 100           ; save energies every 0.2 ps
nstlog          = 100           ; update log file every 0.2 ps
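
This is the tentative direction I was considering, i.e. writing 
full-precision frames much less often and relying on compressed .xtc output 
for day-to-day analysis (the exact values below are just guesses on my part):
; Output control (tentative)
nstxout         = 5000          ; full-precision coordinates every 10 ps
nstvout         = 5000          ; full-precision velocities every 10 ps
nstxtcout       = 500           ; compressed .xtc coordinates every 1 ps
xtc_precision   = 1000          ; default .xtc precision
nstenergy       = 500           ; save energies every 1 ps
nstlog          = 500           ; update log file every 1 ps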

 
Many thanks for your help!
 