Re: [gmx-users] Long range Lennard Jones

2013-08-28 Thread Mark Abraham
Lennard-Jones PME is planned for 5.0. Mark. On Aug 28, 2013 8:36 AM, "Gianluca Interlandi" wrote: > Hi! > > Just wondering whether GROMACS has (or plans to implement) a correction > for the loss of long-range LJ interactions? Something similar to > LJcorrection in NAMD or IPS in CHARMM. > > Thanks!
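For readers looking for the existing analogue of NAMD's LJcorrection: GROMACS already offers an analytic long-range dispersion correction through a single .mdp option. A minimal sketch, assuming 4.6-era option names; the cut-off value is illustrative and not taken from this thread:

    ; analytic tail correction for the truncated LJ interactions
    DispCorr = EnerPres   ; corrects both energy and pressure ("Ener" corrects energy only)
    vdwtype  = cut-off
    rvdw     = 1.2

This is a mean-field correction that assumes a homogeneous fluid beyond rvdw, so it is not equivalent to a full lattice sum for LJ.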

Re: [gmx-users] Gentle heating with implicit solvent

2013-08-28 Thread Mark Abraham
It can be. Lack of explicit degrees of freedom of solvent can make achieving equipartition tricky. With CHARMM27 and virtual sites in implicit solvent, I have sometimes found it necessary to use a sub-femtosecond time step at the start of equilibration, even where there were no atomic clashes. Maybe …
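One way to realise the gentle start described above is a short first equilibration stage with a sub-femtosecond step, switching to the production time step afterwards. A minimal sketch, assuming 4.6-era .mdp option names and a GBSA implicit-solvent setup; all values are illustrative:

    integrator       = sd        ; stochastic dynamics doubles as a thermostat
    dt               = 0.0005    ; 0.5 fs for the first stage; raise to 0.002 in a later .mdp
    nsteps           = 20000     ; 10 ps at this step size
    implicit-solvent = GBSA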

Re: [gmx-users] Long range Lennard Jones

2013-08-28 Thread rajat desikan
Hi, What is LJ PME? I googled it and got this publication: http://pubs.acs.org/doi/abs/10.1021/ct400146w So, LJ will not be cut off at some r, but you will have a real-space + Fourier part, similar to electrostatics. Is that LJ PME? What are the advantages? On Wed, Aug 28, 2013 at 12:36 PM, Mark Abraham …

Re: [gmx-users] Long range Lennard Jones

2013-08-28 Thread David van der Spoel
On 2013-08-28 09:31, rajat desikan wrote: Hi, What is LJ PME? I googled it and got this publication: http://pubs.acs.org/doi/abs/10.1021/ct400146w So, LJ will not be cut off at some r, but you will have a real-space + Fourier part, similar to electrostatics. Is that LJ PME? What are the advantages? http…

Re: [gmx-users] Long range Lennard Jones

2013-08-28 Thread Mark Abraham
Secondarily, one could use the same cut-off for LJ and electrostatics, and treat their respective lattice components however you like. This simplifies implementations for computing short-ranged interactions, while facilitating iso-accuracy load balancing across heterogeneous compute units. Mark On …
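In current (4.6) terms, the matched-cut-off part of that idea looks like the sketch below, with PME handling the electrostatic lattice component and the analytic dispersion correction standing in for the LJ lattice term until LJ-PME arrives. Option names are 4.6-era; the cut-off value is illustrative:

    cutoff-scheme = Verlet
    coulombtype   = PME
    rcoulomb      = 1.0      ; same short-range cut-off for both terms
    vdwtype       = cut-off
    rvdw          = 1.0
    DispCorr      = EnerPres ; analytic LJ tail correction in the meantime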

[gmx-users] total charge

2013-08-28 Thread Group Gro
Dear GROMACS users, I have a question about the total charge of a system. I executed the pdb2gmx command, the result of which is quoted below: "Keeping all generated dihedrals Making cmap torsions... There are 7808 dihedrals, 591 impropers, 5298 angles, 7596 pairs, 2922 bonds and 0 virtual sites …

Re: [gmx-users] total charge

2013-08-28 Thread Mark Abraham
If you're not sure what charge your chains should have had, then you should go back and think about the titratable residues and what you're trying to model. Don't assume some code's defaults are what you want! The earlier mention of "total charge -3" refers to one of the chains, but pdb2gmx is not
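Two quick ways to see where a charge like -3 comes from, assuming the default output names from pdb2gmx (the per-chain file name below is hypothetical; the grep relies on the running "qtot" comments pdb2gmx writes in the [ atoms ] section):

    # running per-atom charge total; the last value is that chain's net charge
    grep qtot topol_Protein_chain_A.itp | tail -n 1
    grep qtot topol.top | tail -n 1

    # grompp also reports the net charge of the whole system in its notes,
    # e.g. "System has non-zero total charge: -3.000000"
    grompp -f md.mdp -c conf.gro -p topol.top -o check.tpr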

[gmx-users] DMPC Bilayer

2013-08-28 Thread Rama
Hello, During the NPT stage the two leaflets in the DMPC bilayer separate for a while and then come closer. Is this common at this stage, or did something go wrong in equilibration? Thanks --Rama -- View this message in context: http://gromacs.5086.x6.nabble.com/DMPC-Bilayer-tp5010783.html Sent from the GROMACS …

Re: [gmx-users] DMPC Bilayer

2013-08-28 Thread Justin Lemkul
On 8/28/13 11:12 AM, Rama wrote: Hello, During the NPT stage the two leaflets in the DMPC bilayer separate for a while and then come closer. Is this common at this stage, or did something go wrong in equilibration? Depending on what the previous preparation steps were, this can certainly occur. -Justin --
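For context, membrane NPT runs normally use semiisotropic pressure coupling so the box can relax independently in the membrane plane and along the normal; some early breathing of the leaflet spacing that then settles is usually harmless. A minimal sketch of the pressure-coupling block (values illustrative, not taken from Rama's input):

    pcoupl          = Parrinello-Rahman
    pcoupltype      = semiisotropic
    tau_p           = 5.0
    ref_p           = 1.0  1.0          ; xy and z, in bar
    compressibility = 4.5e-5  4.5e-5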

[gmx-users] work in 4.5.5 but failed in 4.6.1

2013-08-28 Thread Albert
Hello: I am constraining one part of the protein and trying to generate md.tpr with the command: grompp -f md.mdp -c npt4.gro -n -o md.tpr It works fine in 4.6.3, but it fails in 4.5.5 with the following warning message: WARNING 1 [file md.mdp, line 65]: Unknown left-hand 'cutoff-scheme' in parameter file …
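The likely cause is that cutoff-scheme only exists from GROMACS 4.6 onwards, so a 4.5.x grompp treats it as an unknown option. A sketch of the version-dependent choice:

    ; GROMACS 4.6 and later only:
    cutoff-scheme = Verlet   ; or "group" to reproduce the pre-4.6 behaviour
    ; For 4.5.x, delete the line entirely; the group scheme is the only one available there.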

[gmx-users] problem of submitting job in HPC

2013-08-28 Thread Albert
Hello: I am trying to use the following command to run 4.6.3 on an HPC cluster: mpiexec -n 32 /opt/gromacs/4.6.3/bin/mdrun_mpi -dlb yes -v -s md.tpr -x md.xtc -o md.trr -g md.log -e md.edr >& md.info 4.5.5 works fine on this machine with the command: mpiexec -n 32 mdrun -nosum -dlb yes -v -s md.tpr …

Re: [gmx-users] problem of submitting job in HPC

2013-08-28 Thread Justin Lemkul
On 8/28/13 12:39 PM, Albert wrote: Hello: I am trying to use the following command to run 4.6.3 on an HPC cluster: mpiexec -n 32 /opt/gromacs/4.6.3/bin/mdrun_mpi -dlb yes -v -s md.tpr -x md.xtc -o md.trr -g md.log -e md.edr >& md.info 4.5.5 works fine on this machine with the command: mpiexec …

Re: [gmx-users] work in 4.5.5 but failed in 4.6.1

2013-08-28 Thread Justin Lemkul
On 8/28/13 11:48 AM, Albert wrote: Hello: I am constraining one part of the protein and trying to generate md.tpr with the command: grompp -f md.mdp -c npt4.gro -n -o md.tpr It works fine in 4.6.3, but it fails in 4.5.5 with the following warning message: WARNING 1 [file md.mdp, line 65]: Unknown …

Re: [gmx-users] work in 4.5.5 but failed in 4.6.1

2013-08-28 Thread Albert
On 08/28/2013 07:07 PM, Justin Lemkul wrote: WARNING 2 [file helix.itp, line 1]: Too few parameters on line (source file toppush.c, line 1501) Looks concerning - what's line 1? Here are the initial lines:
; position restraints for part of C-alpha of Protein
[ position_restraints ]
;  i  funct …

Re: [gmx-users] work in 4.5.5 but failed in 4.6.1

2013-08-28 Thread Justin Lemkul
On 8/28/13 1:21 PM, Albert wrote: On 08/28/2013 07:07 PM, Justin Lemkul wrote: WARNING 2 [file helix.itp, line 1]: Too few parameters on line (source file toppush.c, line 1501) Looks concerning - what's line 1? Here are the initial lines: ; position restraints for part of C-alpha of Protein …

Re: [gmx-users] work in 4.5.5 but failed in 4.6.1

2013-08-28 Thread Albert
On 08/28/2013 07:25 PM, Justin Lemkul wrote: Looks normal, so without context of how it is #included, there's not much to diagnose here. Here is my #include in the topol.top file:
; Include Position restraint file
#ifdef POSRES
#include "restrain.itp"
#endif
I first generate restraints for all …

Re: [gmx-users] work in 4.5.5 but failed in 4.6.1

2013-08-28 Thread Justin Lemkul
On 8/28/13 1:36 PM, Albert wrote: On 08/28/2013 07:25 PM, Justin Lemkul wrote: Looks normal, so without context of how it is #included, there's not much to diagnose here. Here is my #include in the topol.top file: ; Include Position restraint file #ifdef POSRES #include "restrain.itp" #endif

Re: [gmx-users] work in 4.5.5 but failed in 4.6.1

2013-08-28 Thread Albert
On 08/28/2013 07:38 PM, Justin Lemkul wrote: That's not the problem. It's complaining about whatever is on line 1 (not clear from the previous message if the comment line is #1 or a blank line), so assuming that the #ifdef is in the right place (probably is, or the error would be different),
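For reference, every data line in a [ position_restraints ] block needs five fields: the atom index (relative to its moleculetype), the function type, and three force constants. A minimal well-formed sketch (indices and force constants are illustrative):

    [ position_restraints ]
    ;   i  funct       fcx        fcy        fcz
        5      1       1000       1000       1000
       12      1       1000       1000       1000

A stray character or an incomplete first line ahead of this block is enough to make grompp complain about "Too few parameters" at line 1.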

Re: [gmx-users] problem of submitting job in HPC

2013-08-28 Thread Mark Abraham
On Wed, Aug 28, 2013 at 7:06 PM, Justin Lemkul wrote: > > > On 8/28/13 12:39 PM, Albert wrote: >> >> Hello: >> >> I am trying to use following command to run 4.6.3 in a HPC cluster: >> >> mpiexec -n 32 /opt/gromacs/4.6.3/bin/mdrun_mpi -dlb yes -v -s md.tpr -x >> md.xtc >> -o md.trr -g md.log -e

Re: [gmx-users] Long range Lennard Jones

2013-08-28 Thread Gianluca Interlandi
Thanks for your replies, Mark. What do you think about the current DispCorr option in GROMACS? Is it worth trying? Also, I wonder whether using DispCorr for LJ plus PME for Coulomb justifies reducing the non-bonded cutoff to 1 nm with the CHARMM force field, where 1.2 nm is usually recommended …

Re: [gmx-users] Long range Lennard Jones

2013-08-28 Thread Justin Lemkul
On 8/28/13 7:28 PM, Gianluca Interlandi wrote: Thanks for your replies, Mark. What do you think about the current DispCorr option in GROMACS? Is it worth trying? Also, I wonder whether using DispCorr for LJ plus PME for Coulomb justifies reducing the non-bonded cutoff to 1 nm with the CHARMM …
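For reference, a commonly used group-scheme approximation of the CHARMM non-bonded setup in GROMACS 4.6 looks like the sketch below. The option names are real, but the values follow common practice rather than anything stated in this thread, and the exact cut-off/list relationships should be checked against grompp's notes for your version:

    vdwtype      = switch
    rvdw-switch  = 1.0
    rvdw         = 1.2
    rlist        = 1.2
    coulombtype  = PME
    rcoulomb     = 1.2
    DispCorr     = no        ; CHARMM was parameterised against the truncated LJ potential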

[gmx-users] CGenFF in GROMACS

2013-08-28 Thread Golshan Hejazi
Hello everyone, I want to use the CGenFF force field in GROMACS. I downloaded the Cgenffbon.itp and Cgenffnb.itp files and put them in the charmm36.ff directory. - I replaced the lines in forcefield.itp with #include "Cgenffbon.itp" and #include "Cgenffnb.itp" - I modified the rtp file and …

Re: [gmx-users] CGenFF in GROMACS

2013-08-28 Thread Justin Lemkul
On 8/28/13 8:23 PM, Golshan Hejazi wrote: Hello everyone, I want to use the CGenFF force field in GROMACS. I downloaded the Cgenffbon.itp and Cgenffnb.itp files and put them in the charmm36.ff directory. - I replaced the lines in forcefield.itp with #include "Cgenffbon.itp" and #include "Cgenffnb.itp" …
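As a sketch of the usual layout, assuming the goal is to add CGenFF on top of the existing CHARMM parameters rather than replace them (the Cgenffnb.itp/Cgenffbon.itp names are the poster's, and the [ defaults ] line is the standard CHARMM one), the edited forcefield.itp might look like:

    [ defaults ]
    ; nbfunc  comb-rule  gen-pairs  fudgeLJ  fudgeQQ
      1       2          yes        1.0      1.0

    #include "ffnonbonded.itp"
    #include "Cgenffnb.itp"    ; CGenFF non-bonded parameters, after the stock ones
    #include "ffbonded.itp"
    #include "Cgenffbon.itp"   ; CGenFF bonded parameters

Any atom types duplicated between the stock files and the CGenFF files would still need to be reconciled by hand.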

Re: [gmx-users] Long range Lennard Jones

2013-08-28 Thread Gianluca Interlandi
Current CHARMM development relies on a 1.2-nm cutoff for LJ, so that's how we balance all of the forces during parameterization. OK, I agree. What about the use of PME for Coulomb? The CHARMM PARAM22 force field was parametrized using SHIFT on the electrostatic forces, making them zero beyond 12 Å …

Re: [gmx-users] Long range Lennard Jones

2013-08-28 Thread Justin Lemkul
On 8/28/13 9:09 PM, Gianluca Interlandi wrote: Current CHARMM development relies on a 1.2-nm cutoff for LJ, so that's how we balance all of the forces during parameterization. Ok, I agree. What about the use of PME for Coulomb? The CHARMM PARAM22 force field was parametrized using SHIFT on el

Re: [gmx-users] Long range Lennard Jones

2013-08-28 Thread Gianluca Interlandi
Justin, I respect your opinion on this. However, in the paper indicated below by BR Brooks they used a cutoff of 10 Å on LJ when testing IPS in CHARMM: Title: Pressure-based long-range correction for Lennard-Jones interactions in molecular dynamics simulations: Application to alkanes and inte…

[gmx-users] ERROR: GROMACS finished with error 74

2013-08-28 Thread sri2201
Dear GROMACS users, I am running an MD simulation of a protein complex (44 kDa) with the amber99sb-ildn force field. I am getting the following error; it looks like a syntax error. Input file: gmx-495644.pdb Base name: gmx-495644 Source directory: /scratch/home/enmr028/home_cream_840250368/CREAM840250368 GR…

[gmx-users] MD vs. free energy simulations

2013-08-28 Thread Jernej Zidar
Hi, I ran some MD simulations (NPT ensemble) and a series of simulations to determine the free energy of solvation in water of a not-too-big molecule. I noticed that while I was able to run the MD simulations using all the CPUs (or threads) in my workstation (12 CPUs or 24 threads, respectively), …
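One thing worth checking, assuming a GROMACS 4.6-style mdrun: the thread-MPI/OpenMP split can be set explicitly, which makes it easier to see whether the free-energy run is being limited by the automatic decomposition. The file name and thread counts below are illustrative:

    # 12 thread-MPI ranks with 2 OpenMP threads each on a 24-thread workstation
    mdrun -ntmpi 12 -ntomp 2 -deffnm solvation_lambda0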

Re: [gmx-users] problem of submitting job in HPC

2013-08-28 Thread Albert
Hello Mark: Thanks a lot for the kind advice. Here is my log file for the output of mdrun -version; there is always some duplicated information and duplicated files: Program: mdrun_mpi Program: mdrun_mpi Program: mdrun_mpi Program: mdrun_mpi Program: mdrun_mpi Program: mdrun_mpi Program: mdrun_mpi Program: mdrun_mpi …
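Repeated "Program: mdrun_mpi" headers like this usually mean mpiexec launched 32 independent single-rank copies rather than one 32-rank job, i.e. the mdrun_mpi binary and the mpiexec in your PATH come from different MPI installations. A quick check (paths illustrative):

    # which MPI the binary was built against (look for the "MPI library" line in the version output)
    /opt/gromacs/4.6.3/bin/mdrun_mpi -version | grep -i "MPI library"

    # and whether mpiexec belongs to the same MPI installation
    which mpiexec
    mpiexec --version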