Re: [gmx-users] Long range Lennard Jones

2013-08-28 Thread Mark Abraham
Lennard-Jones PME is planned for 5.0

Mark
On Aug 28, 2013 8:36 AM, "Gianluca Interlandi" 
wrote:

> Hi!
>
> Just wondering whether gromacs has (or plans to implement) a correction
> for the loss of long range LJ interactions? Something similar to
> LJcorrection in NAMD or IPS in CHARMM.
>
> Thanks!
>
>  Gianluca
>
> -----------------------------------
> Gianluca Interlandi, PhD gianl...@u.washington.edu
> +1 (206) 685 4435
> http://artemide.bioeng.washington.edu/
>
> Research Scientist at the Department of Bioengineering
> at the University of Washington, Seattle WA U.S.A.
> http://healthynaturalbaby.org
> -----------------------------------
>


Re: [gmx-users] Gentle heating with implicit solvent

2013-08-28 Thread Mark Abraham
It can be. Lack of explicit degrees of freedom of solvent can make
achieving equipartition tricky. With CHARMM27 and virtual sites in implicit
solvent, I have sometimes found it necessary to use a sub-femtosecond time
step at the start of equilibration, even where there were no atomic
clashes. Maybe the system was just unlucky with generating velocities,
though :-)

Mark
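
For concreteness, here is a minimal sketch of such a cautious start in .mdp
form. All values are illustrative assumptions, not settings taken from this
thread: it heats from 50 K to 300 K over 50 ps with a 0.5 fs step.

; illustrative equilibration fragment, assuming one temperature-coupling group
integrator        = md
dt                = 0.0005      ; 0.5 fs; raise once the system is stable
nsteps            = 100000      ; 50 ps
tcoupl            = v-rescale
tc-grps           = System
tau-t             = 0.1
ref-t             = 300
annealing         = single
annealing-npoints = 2
annealing-time    = 0 50        ; ps
annealing-temp    = 50 300      ; K

Once equilibrated, dt can usually be raised to the normal production value.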
On Aug 28, 2013 7:16 AM, "Gianluca Interlandi" 
wrote:

> How important is it to do gentle heating (using simulated annealing) with
> GBSA? Often with explicit water it is enough to perform some equilibration
> with positional restraints. Would it be enough to do the same with implicit
> solvent?
>
> Thanks,
>
>  Gianluca
>
> -----------------------------------
> Gianluca Interlandi, PhD gianl...@u.washington.edu
> +1 (206) 685 4435
> http://artemide.bioeng.washington.edu/
>
> Research Scientist at the Department of Bioengineering
> at the University of Washington, Seattle WA U.S.A.
> http://healthynaturalbaby.org
> -----------------------------------
>


Re: [gmx-users] Long range Lennard Jones

2013-08-28 Thread rajat desikan
Hi,
What is LJ PME? I googled it and found this publication:
http://pubs.acs.org/doi/abs/10.1021/ct400146w

So the LJ interaction will not be cut off at some r; instead there will be a
real-space part and a Fourier-space part, similar to electrostatics. Is that
LJ PME? What are the advantages?


On Wed, Aug 28, 2013 at 12:36 PM, Mark Abraham wrote:

> Lennard-Jones PME is planned for 5.0
>
> Mark
> On Aug 28, 2013 8:36 AM, "Gianluca Interlandi" 
> wrote:
>
> > Hi!
> >
> > Just wondering whether gromacs has (or plans to implement) a correction
> > for the loss of long range LJ interactions? Something similar to
> > LJcorrection in NAMD or IPS in CHARMM.
> >
> > Thanks!
> >
> >  Gianluca
> >
> > -----------------------------------
> > Gianluca Interlandi, PhD gianl...@u.washington.edu
> > +1 (206) 685 4435
> > http://artemide.bioeng.washington.edu/
> >
> > Research Scientist at the Department of Bioengineering
> > at the University of Washington, Seattle WA U.S.A.
> > http://healthynaturalbaby.org
> > -----------------------------------



-- 
Rajat Desikan (Ph.D Scholar)
Prof. K. Ganapathy Ayappa's Lab (no 13),
Dept. of Chemical Engineering,
Indian Institute of Science, Bangalore


Re: [gmx-users] Long range Lennard Jones

2013-08-28 Thread David van der Spoel

On 2013-08-28 09:31, rajat desikan wrote:

Hi,
What is LJ PME? I googled it and found this publication:
http://pubs.acs.org/doi/abs/10.1021/ct400146w

So the LJ interaction will not be cut off at some r; instead there will be a
real-space part and a Fourier-space part, similar to electrostatics. Is that
LJ PME? What are the advantages?


http://pubs.acs.org/doi/abs/10.1021/ct400140n


On Wed, Aug 28, 2013 at 12:36 PM, Mark Abraham wrote:


Lennard-Jones PME is planned for 5.0

Mark
On Aug 28, 2013 8:36 AM, "Gianluca Interlandi" 
wrote:


Hi!

Just wondering whether gromacs has (or plans to implement) a correction
for the loss of long range LJ interactions? Something similar to
LJcorrection in NAMD or IPS in CHARMM.

Thanks!

  Gianluca

-----------------------------------
Gianluca Interlandi, PhD gianl...@u.washington.edu
+1 (206) 685 4435
http://artemide.bioeng.washington.edu/

Research Scientist at the Department of Bioengineering
at the University of Washington, Seattle WA U.S.A.
http://healthynaturalbaby.org
-----------------------------------
--
David van der Spoel, Ph.D., Professor of Biology
Dept. of Cell & Molec. Biol., Uppsala University.
Box 596, 75124 Uppsala, Sweden. Phone:  +46184714205.
sp...@xray.bmc.uu.se    http://folding.bmc.uu.se


Re: [gmx-users] Long range Lennard Jones

2013-08-28 Thread Mark Abraham
Secondarily, one could use the same cut-off for LJ and electrostatics,
and treat their respective lattice components however you like. This
simplifies implementations for computing short-ranged interactions,
while facilitating iso-accuracy load balancing across heterogeneous
compute units.

Mark
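
For reference, the idea behind LJ-PME is an Ewald-style split of the
dispersion term; the following is only a sketch (see the papers linked above
for the actual derivation used in GROMACS):

  E_disp = -\sum_{i<j} C_6^{ij} \frac{g(\beta r_{ij})}{r_{ij}^6}
           -\sum_{i<j} C_6^{ij} \frac{1 - g(\beta r_{ij})}{r_{ij}^6},
  \qquad g(x) = e^{-x^2}\left(1 + x^2 + \frac{x^4}{2}\right)

The first sum decays quickly and is cut off in real space; the second is
smooth everywhere and can be evaluated on a PME mesh, just like the Coulomb
lattice sum.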

On Wed, Aug 28, 2013 at 10:08 AM, David van der Spoel
 wrote:
> On 2013-08-28 09:31, rajat desikan wrote:
>>
>> Hi,
>> What is LJ PME? I googled it and found this publication:
>> http://pubs.acs.org/doi/abs/10.1021/ct400146w
>>
>> So the LJ interaction will not be cut off at some r; instead there will be
>> a real-space part and a Fourier-space part, similar to electrostatics. Is
>> that LJ PME? What are the advantages?
>>
> http://pubs.acs.org/doi/abs/10.1021/ct400140n
>
>>
>> On Wed, Aug 28, 2013 at 12:36 PM, Mark Abraham
>> wrote:
>>
>>> Lennard-Jones PME is planned for 5.0
>>>
>>> Mark
>>> On Aug 28, 2013 8:36 AM, "Gianluca Interlandi"
>>> 
>>> wrote:
>>>
 Hi!

 Just wondering whether gromacs has (or plans to implement) a correction
 for the loss of long range LJ interactons? Something similar to
 LJcorrection in NAMD or IPS in CHARMM.

 Thanks!

   Gianluca

 --**---
 Gianluca Interlandi, PhD gianl...@u.washington.edu
  +1 (206) 685 4435
  http://artemide.bioeng.**washington.edu/<
>>>
>>> http://artemide.bioeng.washington.edu/>


 Research Scientist at the Department of Bioengineering
 at the University of Washington, Seattle WA U.S.A.
  http://healthynaturalbaby.org
 --**---
 --
 gmx-users mailing listgmx-users@gromacs.org
 http://lists.gromacs.org/**mailman/listinfo/gmx-users<
>>>
>>> http://lists.gromacs.org/mailman/listinfo/gmx-users>

 * Please search the archive at http://www.gromacs.org/**
 Support/Mailing_Lists/Search<
>>>
>>> http://www.gromacs.org/Support/Mailing_Lists/Search>before posting!

 * Please don't post (un)subscribe requests to the list. Use the www
 interface or send it to gmx-users-requ...@gromacs.org.
 * Can't post? Read http://www.gromacs.org/**Support/Mailing_Lists<
>>>
>>> http://www.gromacs.org/Support/Mailing_Lists>


>>> --
>>> gmx-users mailing listgmx-users@gromacs.org
>>> http://lists.gromacs.org/mailman/listinfo/gmx-users
>>> * Please search the archive at
>>> http://www.gromacs.org/Support/Mailing_Lists/Search before posting!
>>> * Please don't post (un)subscribe requests to the list. Use the
>>> www interface or send it to gmx-users-requ...@gromacs.org.
>>> * Can't post? Read http://www.gromacs.org/Support/Mailing_Lists
>>>
>>
>>
>>
>
>
> --
> David van der Spoel, Ph.D., Professor of Biology
> Dept. of Cell & Molec. Biol., Uppsala University.
> Box 596, 75124 Uppsala, Sweden. Phone:  +46184714205.
> sp...@xray.bmc.uu.se    http://folding.bmc.uu.se


[gmx-users] total charge

2013-08-28 Thread Group Gro
Hi Dear Gromacs users,
I have a question about the total charge of a system. I executed the pdb2gmx
command; the result is quoted below:

"Keeping all generated dihedrals
Making cmap torsions...There are 7808 dihedrals,  591 impropers, 5298 angles
  7596 pairs, 2922 bonds and 0 virtual sites
Total mass 20072.826 a.m.u.
Total charge -3.000 e
Writing topology
Including chain 1 in system: 2869 atoms 180 residues
Including chain 2 in system: 2869 atoms 180 residues
Including chain 3 in system: 2869 atoms 180 residues
Now there are 8607 atoms and 540 residues
Total mass in system 60218.478 a.m.u.
Total charge in system -9.000 e"


If I want to neutralize the system with Na+ ions, how many should I add, 3 or
9? I think I need 9, but I am not sure about that.

Best Regards.
S. Faraji



Re: [gmx-users] total charge

2013-08-28 Thread Mark Abraham
If you're not sure what charge your chains should have had, then you
should go back and think about the titratable residues and what you're
trying to model. Don't assume some code's defaults are what you want!

The earlier mention of "total charge -3" refers to one of the chains,
but pdb2gmx is not completely clear with its output.

Mark
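
If the totals above are indeed what you intend, the arithmetic is: each of
the three identical chains carries -3 e, giving -9 e in total, so 9 Na+ ions
neutralize the system. A sketch of adding them with genion (file names are
placeholders, and the ion name depends on the force field, e.g. NA vs SOD):

genion -s topol.tpr -o solvated_ions.gro -p topol.top -pname NA -np 9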

On Wed, Aug 28, 2013 at 1:25 PM, Group Gro  wrote:
> Hi Dear Gromacs users,
> I have a question about the total charge of a system. I executed the pdb2gmx
> command; the result is quoted below:
>
> "Keeping all generated dihedrals
> Making cmap torsions...There are 7808 dihedrals,  591 impropers, 5298 angles
>   7596 pairs, 2922 bonds and 0 virtual sites
> Total mass 20072.826 a.m.u.
> Total charge -3.000 e
> Writing topology
> Including chain 1 in system: 2869 atoms 180 residues
> Including chain 2 in system: 2869 atoms 180 residues
> Including chain 3 in system: 2869 atoms 180 residues
> Now there are 8607 atoms and 540 residues
> Total mass in system 60218.478 a.m.u.
> Total charge in system -9.000 e"
>
>
> If I want to neutralize the system with Na+ ions, how many should I add,
> 3 or 9? I think I need 9, but I am not sure about that.
>
> Best Regards.
> S. Faraji
>


[gmx-users] DMPC Bilayer

2013-08-28 Thread Rama

Hello,

At the NPT stage, the two leaflets of the DMPC bilayer separate for a while
and then come closer again. Is this common at this stage, or has something
gone wrong in equilibration?


Thanks
--Rama





Re: [gmx-users] DMPC Bilayer

2013-08-28 Thread Justin Lemkul



On 8/28/13 11:12 AM, Rama wrote:


Hello,

At the NPT stage, the two leaflets of the DMPC bilayer separate for a while
and then come closer again. Is this common at this stage, or has something
gone wrong in equilibration?



Depending on what the previous preparation steps were, this can certainly occur.

-Justin

--
==

Justin A. Lemkul, Ph.D.
Postdoctoral Fellow

Department of Pharmaceutical Sciences
School of Pharmacy
Health Sciences Facility II, Room 601
University of Maryland, Baltimore
20 Penn St.
Baltimore, MD 21201

jalem...@outerbanks.umaryland.edu | (410) 706-7441

==


[gmx-users] work in 4.5.5 but failed in 4.6.1

2013-08-28 Thread Albert

Hello:

I am restraining one part of the protein and trying to generate md.tpr
with the command:


grompp -f md.mdp -c npt4.gro -n -o md.tpr

it works fine in 4.6.3, but it fails in 4.5.5 with the following warning
messages:



WARNING 1 [file md.mdp, line 65]:
  Unknown left-hand 'cutoff-scheme' in parameter file
WARNING 2 [file helix.itp, line 1]:
  Too few parameters on line (source file toppush.c, line 1501)
WARNING 3 [file md.mdp]:
  The sum of the two largest charge group radii (13.715767) is larger than
  rlist (1.00)

There were 3 notes
There were 3 warnings

---
Program grompp, VERSION 4.5.5
Source code file: grompp.c, line: 1584

Does anybody have any idea?

thx

Albert



[gmx-users] problem of submitting job in HPC

2013-08-28 Thread Albert

Hello:

 I am trying to use the following command to run 4.6.3 on an HPC cluster:

mpiexec -n 32 /opt/gromacs/4.6.3/bin/mdrun_mpi  -dlb yes -v -s md.tpr -x 
md.xtc -o md.trr -g md.log -e md.edr  >& md.info


4.5.5 works fine on this machine with the command:

mpiexec -n 32 mdrun -nosum -dlb yes -v -s md.tpr -x md.xtc -o md.trr -g 
md.log -e md.edr  >& md.info


the difference is that the option "-nosum" is not available in 4.6.3

but 4.6.3 always fails. It generates a lot of duplicate files and duplicated
log information. It looks like mpiexec is invoking the serial mdrun.


Does anybody have any idea?

thank you very much.

best
Albert


Re: [gmx-users] problem of submitting job in HPC

2013-08-28 Thread Justin Lemkul



On 8/28/13 12:39 PM, Albert wrote:

Hello:

  I am trying to use the following command to run 4.6.3 on an HPC cluster:

mpiexec -n 32 /opt/gromacs/4.6.3/bin/mdrun_mpi  -dlb yes -v -s md.tpr -x md.xtc
-o md.trr -g md.log -e md.edr  >& md.info

the 4.5.5 works fine in this machine with command:

mpiexec -n 32 mdrun -nosum -dlb yes -v -s md.tpr -x md.xtc -o md.trr -g md.log
-e md.edr  >& md.info

the difference is that the option "-nosum" is not available in 4.6.3

but 4.6.3 always fails. It generates a lot of duplicate files and duplicated
log information. It looks like mpiexec is invoking the serial mdrun.

Does anybody have any idea?



Have you verified that the 4.6.3 mdrun was correctly installed such that it can 
make use of MPI?


-Justin

--
==

Justin A. Lemkul, Ph.D.
Postdoctoral Fellow

Department of Pharmaceutical Sciences
School of Pharmacy
Health Sciences Facility II, Room 601
University of Maryland, Baltimore
20 Penn St.
Baltimore, MD 21201

jalem...@outerbanks.umaryland.edu | (410) 706-7441

==


Re: [gmx-users] work in 4.5.5 but failed in 4.6.1

2013-08-28 Thread Justin Lemkul



On 8/28/13 11:48 AM, Albert wrote:

Hello:

I am restraining one part of the protein and trying to generate md.tpr with
the command:

grompp -f md.mdp -c npt4.gro -n -o md.tpr

it works fine in 4.6.3, but it fails in 4.5.5 with the following warning messages:


WARNING 1 [file md.mdp, line 65]:
   Unknown left-hand 'cutoff-scheme' in parameter file


Makes sense; this option did not exist before version 4.6.
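
In other words, an md.mdp written for 4.6 may contain a line like the sketch
below, which grompp 4.5.5 cannot parse; removing or commenting it out (the
group scheme is the only behaviour 4.5.5 knows) silences warning 1. This
assumes that is indeed what sits on line 65:

; cutoff-scheme = Verlet    ; 4.6-only option; comment out or delete for 4.5.5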


WARNING 2 [file helix.itp, line 1]:
   Too few parameters on line (source file toppush.c, line 1501)


Looks concerning - what's line 1?


WARNING 3 [file md.mdp]:
   The sum of the two largest charge group radii (13.715767) is larger than
   rlist (1.00)



How big is your box?  This may very well be a simple periodicity issue.

http://www.gromacs.org/Documentation/Errors#The_sum_of_the_two_largest_charge_group_radii_(X)_is_larger_than.c2.a0rlist_-_rvdw.2frcoulomb

-Justin

--
==

Justin A. Lemkul, Ph.D.
Postdoctoral Fellow

Department of Pharmaceutical Sciences
School of Pharmacy
Health Sciences Facility II, Room 601
University of Maryland, Baltimore
20 Penn St.
Baltimore, MD 21201

jalem...@outerbanks.umaryland.edu | (410) 706-7441

==


Re: [gmx-users] work in 4.5.5 but failed in 4.6.1

2013-08-28 Thread Albert

On 08/28/2013 07:07 PM, Justin Lemkul wrote:

WARNING 2 [file helix.itp, line 1]:
   Too few parameters on line (source file toppush.c, line 1501)


Looks concerning - what's line 1?

here are the initial lines:

; position restraints for part of C-alpha of Protein

[ position_restraints ]
;   i  funct   fcx   fcy   fcz
    5      1   300   300   300
   24      1   300   300   300






WARNING 3 [file md.mdp]:
   The sum of the two largest charge group radii (13.715767) is larger than
   rlist (1.00)



How big is your box?  This may very well be a simple periodicity issue.

http://www.gromacs.org/Documentation/Errors#The_sum_of_the_two_largest_charge_group_radii_(X)_is_larger_than.c2.a0rlist_-_rvdw.2frcoulomb 



-Justin 


I've seen this information on the gromacs website. My box is:

   6.96418   6.96418   9.77176

(the last line of my input .gro file).

I don't understand why this warning does not appear in 4.6.3 but causes a
failure in 4.5.5.


thank you very much

Albert



Re: [gmx-users] work in 4.5.5 but failed in 4.6.1

2013-08-28 Thread Justin Lemkul



On 8/28/13 1:21 PM, Albert wrote:

On 08/28/2013 07:07 PM, Justin Lemkul wrote:

WARNING 2 [file helix.itp, line 1]:
   Too few parameters on line (source file toppush.c, line 1501)


Looks concerning - what's line 1?

here is the initial lines:

; position restraints for part of C-alpha of Protein

[ position_restraints ]
;   i  funct   fcx   fcy   fcz
    5      1   300   300   300
   24      1   300   300   300




Looks normal, so without context of how it is #included, there's not much to 
diagnose here.







WARNING 3 [file md.mdp]:
   The sum of the two largest charge group radii (13.715767) is larger than
   rlist (1.00)



How big is your box?  This may very well be a simple periodicity issue.

http://www.gromacs.org/Documentation/Errors#The_sum_of_the_two_largest_charge_group_radii_(X)_is_larger_than.c2.a0rlist_-_rvdw.2frcoulomb


-Justin


I've seen this information on the gromacs website. My box is:

6.96418   6.96418   9.77176

(the last line of my input .gro file).

I don't understand why this warning does not appear in 4.6.3 but causes a
failure in 4.5.5.



The misinterpretation of periodicity in warning 3 was a bug that was fixed at 
some point, so the fact that it doesn't show up in 4.6.3 makes sense.


-Justin

--
==

Justin A. Lemkul, Ph.D.
Postdoctoral Fellow

Department of Pharmaceutical Sciences
School of Pharmacy
Health Sciences Facility II, Room 601
University of Maryland, Baltimore
20 Penn St.
Baltimore, MD 21201

jalem...@outerbanks.umaryland.edu | (410) 706-7441

==


Re: [gmx-users] work in 4.5.5 but failed in 4.6.1

2013-08-28 Thread Albert

On 08/28/2013 07:25 PM, Justin Lemkul wrote:
Looks normal, so without context of how it is #included, there's not 
much to diagnose here. 



here is my #include in topol.top file:

 ; Include Position restraint file
#ifdef POSRES
#include "restrain.itp"
#endif


I first generated restraints for all the protein CA atoms with the genrestr
command, and after that I deleted the atoms that I don't want to restrain
from the output restrain.itp. Could that be the problem? Maybe I should set
the unwanted restraint atoms to a force constant of 0 instead of deleting
them from the restrain.itp file?


thx a lot

Albert


Re: [gmx-users] work in 4.5.5 but failed in 4.6.1

2013-08-28 Thread Justin Lemkul



On 8/28/13 1:36 PM, Albert wrote:

On 08/28/2013 07:25 PM, Justin Lemkul wrote:

Looks normal, so without context of how it is #included, there's not much to
diagnose here.



here is my #include in topol.top file:

  ; Include Position restraint file
#ifdef POSRES
#include "restrain.itp"
#endif


I first generated restraints for all the protein CA atoms with the genrestr
command, and after that I deleted the atoms that I don't want to restrain from
the output restrain.itp. Could that be the problem? Maybe I should set the
unwanted restraint atoms to a force constant of 0 instead of deleting them
from the restrain.itp file?



That's not the problem.  It's complaining about whatever is on line 1 (not clear 
from the previous message if the comment line is #1 or a blank line), so 
assuming that the #ifdef is in the right place (probably is, or the error would 
be different), it's possible that there's some weird hidden character that is 
causing the error.


-Justin

--
==

Justin A. Lemkul, Ph.D.
Postdoctoral Fellow

Department of Pharmaceutical Sciences
School of Pharmacy
Health Sciences Facility II, Room 601
University of Maryland, Baltimore
20 Penn St.
Baltimore, MD 21201

jalem...@outerbanks.umaryland.edu | (410) 706-7441

==


Re: [gmx-users] work in 4.5.5 but failed in 4.6.1

2013-08-28 Thread Albert

On 08/28/2013 07:38 PM, Justin Lemkul wrote:


That's not the problem.  It's complaining about whatever is on line 1 
(not clear from the previous message if the comment line is #1 or a 
blank line), so assuming that the #ifdef is in the right place 
(probably is, or the error would be different), it's possible that 
there's some weird hidden character that is causing the error.


-Justin



I see. The problem was solved when I ran the dos2unix command.

thank you very much.

Albert
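
For anyone hitting the same thing: a quick way to see DOS line endings (one
common "hidden character") is to print control characters; a sketch, assuming
GNU coreutils and the dos2unix tool are installed:

cat -A restrain.itp | head -n 3    # CRLF endings show as a trailing ^M$
dos2unix restrain.itp              # rewrite the file with Unix line endings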


Re: [gmx-users] problem of submitting job in HPC

2013-08-28 Thread Mark Abraham
On Wed, Aug 28, 2013 at 7:06 PM, Justin Lemkul  wrote:
>
>
> On 8/28/13 12:39 PM, Albert wrote:
>>
>> Hello:
>>
>>   I am trying to use following command to run 4.6.3 in a HPC cluster:
>>
>> mpiexec -n 32 /opt/gromacs/4.6.3/bin/mdrun_mpi  -dlb yes -v -s md.tpr -x
>> md.xtc
>> -o md.trr -g md.log -e md.edr  >& md.info
>>
>> the 4.5.5 works fine in this machine with command:
>>
>> mpiexec -n 32 mdrun -nosum -dlb yes -v -s md.tpr -x md.xtc -o md.trr -g
>> md.log
>> -e md.edr  >& md.info
>>
>> the difference is that the option "-nosum" is not available in 4.6.3
>>
> >> but 4.6.3 always fails. It generates a lot of duplicate files and duplicated
> >> log information. It looks like mpiexec is invoking the serial mdrun.
>>
>> Does anybody have any idea?
>>
>
> Have you verified that the 4.6.3 mdrun was correctly installed such that it
> can make use of MPI?

i.e. by inspecting the top of the .log file, or the output of mdrun -version?

Mark
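
A sketch of that check; the expected string matches 4.6-era -version output,
whereas "thread_mpi" or "none" would mean mpiexec starts N independent serial
runs:

/opt/gromacs/4.6.3/bin/mdrun_mpi -version 2>&1 | grep -i 'MPI library'
# a real MPI build prints:  MPI library:        MPI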


Re: [gmx-users] Long range Lennard Jones

2013-08-28 Thread Gianluca Interlandi
Thanks for your replies, Mark. What do you think about the current
DispCorr option in gromacs? Is it worth trying? Also, I wonder
whether using DispCorr for LJ plus PME for Coulomb justifies reducing the
cutoff for non-bonded interactions to 1 nm with the CHARMM force field,
where 1.2 nm is usually recommended.


Gianluca

On Wed, 28 Aug 2013, Mark Abraham wrote:


Secondarily, one could use the same cut-off for LJ and electrostatics,
and treat their respective lattice components however you like. This
simplifies implementations for computing short-ranged interactions,
while facilitating iso-accuracy load balancing across heterogeneous
compute units.

Mark

On Wed, Aug 28, 2013 at 10:08 AM, David van der Spoel
 wrote:

On 2013-08-28 09:31, rajat desikan wrote:


Hi,
What is LJ PME? I googled it and found this publication:
http://pubs.acs.org/doi/abs/10.1021/ct400146w

So the LJ interaction will not be cut off at some r; instead there will be a
real-space part and a Fourier-space part, similar to electrostatics. Is that
LJ PME? What are the advantages?


http://pubs.acs.org/doi/abs/10.1021/ct400140n



On Wed, Aug 28, 2013 at 12:36 PM, Mark Abraham
wrote:


Lennard-Jones PME is planned for 5.0

Mark
On Aug 28, 2013 8:36 AM, "Gianluca Interlandi"

wrote:


Hi!

Just wondering whether gromacs has (or plans to implement) a correction
for the loss of long range LJ interactions? Something similar to
LJcorrection in NAMD or IPS in CHARMM.

Thanks!

  Gianluca

-----------------------------------
Gianluca Interlandi, PhD gianl...@u.washington.edu
+1 (206) 685 4435
http://artemide.bioeng.washington.edu/

Research Scientist at the Department of Bioengineering
at the University of Washington, Seattle WA U.S.A.
http://healthynaturalbaby.org
-----------------------------------


--
David van der Spoel, Ph.D., Professor of Biology
Dept. of Cell & Molec. Biol., Uppsala University.
Box 596, 75124 Uppsala, Sweden. Phone:  +46184714205.
sp...@xray.bmc.uu.se    http://folding.bmc.uu.se




-----------------------------------
Gianluca Interlandi, PhD gianl...@u.washington.edu
+1 (206) 685 4435
http://artemide.bioeng.washington.edu/

Research Scientist at the Department of Bioengineering
at the University of Washington, Seattle WA U.S.A.
http://healthynaturalbaby.org
-----------------------------------


Re: [gmx-users] Long range Lennard Jones

2013-08-28 Thread Justin Lemkul



On 8/28/13 7:28 PM, Gianluca Interlandi wrote:

Thanks for your replies, Mark. What do you think about the current DispCorr
option in gromacs? Is it worth trying? Also, I wonder whether using DispCorr
for LJ plus PME for Coulomb justifies reducing the cutoff for non-bonded
interactions to 1 nm with the CHARMM force field, where 1.2 nm is usually
recommended.



This is risky.  Current CHARMM development relies on a 1.2-nm cutoff for LJ, so 
that's how we balance all of the forces during parameterization.  To me, ad hoc 
changes like these are not worth the tiny (potential) increase in performance. 
As I recently told someone else on this topic, if you're intent on fiddling with 
the typical workings of a force field, especially if you're making changes to 
something so fundamental, be prepared to undertake a demonstration that you can 
recapitulate all of the expected outcomes of the force field or improve upon 
them.  My gut feeling, in this case and others, is that you won't be able to. 
You're messing with something that is fairly critical to obtaining sensible results.


As for dispersion correction, it is generally helpful, but it assumes a 
homogeneous environment.  If you simulate with a membrane, for instance, this 
assumption breaks down, though some literature suggests that use of dispersion 
correction in these cases is still better than nothing.
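
For reference, a commonly used CHARMM-style non-bonded block in GROMACS with
the dispersion correction switched on; a sketch only, following the 1.2-nm
convention discussed above:

vdwtype      = switch
rvdw-switch  = 1.0
rvdw         = 1.2
coulombtype  = PME
rcoulomb     = 1.2
DispCorr     = EnerPres   ; analytic long-range LJ correction; assumes a
                          ; homogeneous system (see the membrane caveat above)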


-Justin

--
==

Justin A. Lemkul, Ph.D.
Postdoctoral Fellow

Department of Pharmaceutical Sciences
School of Pharmacy
Health Sciences Facility II, Room 601
University of Maryland, Baltimore
20 Penn St.
Baltimore, MD 21201

jalem...@outerbanks.umaryland.edu | (410) 706-7441

==


[gmx-users] CGennFF in gromacs

2013-08-28 Thread Golshan Hejazi
Hello everyone,

I want to use the CGenFF force field in GROMACS. I downloaded the Cgenffbon.itp
and Cgenffnb.itp files and put them in the charmm36.ff directory.

- I replaced the lines in the forcefield.itp to #include "Cgenffbon.itp"  and 
#include "Cgenffnb.itp" 
- I modified the rtp file and I inserted the corresponding atom, bond, dihedral 
terms as in charmm rtf file for paracetamol in the rtp file in gromacs 
- I inserted the new atomtypes at the end of the atomtype.atp file ... I took 
the atom types from prm file of charmm
- I introduced the new residue name to the residuetype file
- with pdb2gmx ... I could generate the topol.top file without any error

However, when I run grompp, it gives me the following error:

Fatal error:
Atomtype CG331 not found
For more information and tips for troubleshooting, please check the GROMACS
website at http://www.gromacs.org/Documentation/Errors


I already have this atomtype in the atomtype file, and when I ran pdb2gmx it
did not complain! Can you help me figure out what is going on?

Thanks
G.


Re: [gmx-users] CGennFF in gromacs

2013-08-28 Thread Justin Lemkul



On 8/28/13 8:23 PM, Golshan Hejazi wrote:

Hello everyone,

I want to use the CGenFF force field in GROMACS. I downloaded the Cgenffbon.itp
and Cgenffnb.itp files and put them in the charmm36.ff directory.

- I replaced the lines in the forcefield.itp to #include "Cgenffbon.itp"  and #include 
"Cgenffnb.itp"
- I modified the rtp file and I inserted the corresponding atom, bond, dihedral 
terms as in charmm rtf file for paracetamol in the rtp file in gromacs
- I inserted the new atomtypes at the end of the atomtype.atp file ... I took 
the atom types from prm file of charmm
- I introduced the new residue name to the residuetype file
- with pdb2gmx ... I could generate the topol.top file without any error

However, when I run grompp, it gives me the following error:

Fatal error:
Atomtype CG331 not found
For more information and tips for troubleshooting, please check the GROMACS
website at http://www.gromacs.org/Documentation/Errors


I already have this atomtype in the atomtype file, and when I ran pdb2gmx it
did not complain! Can you help me figure out what is going on?



That is the only time atomtypes.atp is used.  The error from grompp comes 
because it can't find CG331 in CGenffnb.itp, so check that file carefully.
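
For comparison, a grompp-readable [ atomtypes ] entry needs roughly the
fields below. The CG331 numbers are converted from the CHARMM prm values
(sigma = R_min/2^(1/6) in nm, epsilon in kJ/mol) and should be checked
against your own files:

[ atomtypes ]
; name  at.num     mass  charge  ptype     sigma   epsilon
CG331        6  12.0110   0.000      A  0.365268  0.326352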


-Justin

--
==

Justin A. Lemkul, Ph.D.
Postdoctoral Fellow

Department of Pharmaceutical Sciences
School of Pharmacy
Health Sciences Facility II, Room 601
University of Maryland, Baltimore
20 Penn St.
Baltimore, MD 21201

jalem...@outerbanks.umaryland.edu | (410) 706-7441

==


Re: [gmx-users] Long range Lennard Jones

2013-08-28 Thread Gianluca Interlandi
Current CHARMM development relies on a 1.2-nm cutoff for LJ, so that's 
how we balance all of the forces during parameterization.


Ok, I agree. What about the use of PME for Coulomb? The CHARMM PARAM22 
force field was parametrized using SHIFT on electrostatic forces making it 
zero after 12 A (please note that SHIFT and FSHIFT in CHARMM greatly 
differ from the gromacs-style Shift). Also, the CHARMM switch function is 
different from gromacs Switch (at least from what I can tell reading pages 
71-72 of the gromacs manual 4.6.3).


Thanks,

 Gianluca

ad hoc changes like these are not worth the tiny (potential) increase in 
performance. As I recently told someone else on this topic, if you're intent 
on fiddling with the typical workings of a force field, especially if you're 
making changes to something so fundamental, be prepared to undertake a 
demonstration that you can recapitulate all of the expected outcomes of the 
force field or improve upon them.  My gut feeling, in this case and others, 
is that you won't be able to. You're messing with something that is fairly 
critical to obtaining sensible results.


As for dispersion correction, it is generally helpful, but it assumes a 
homogeneous environment.  If you simulate with a membrane, for instance, this 
assumption breaks down, though some literature suggests that use of 
dispersion correction in these cases is still better than nothing.


-Justin

--
==

Justin A. Lemkul, Ph.D.
Postdoctoral Fellow

Department of Pharmaceutical Sciences
School of Pharmacy
Health Sciences Facility II, Room 601
University of Maryland, Baltimore
20 Penn St.
Baltimore, MD 21201

jalem...@outerbanks.umaryland.edu | (410) 706-7441

==



-----------------------------------
Gianluca Interlandi, PhD gianl...@u.washington.edu
+1 (206) 685 4435
http://artemide.bioeng.washington.edu/

Research Scientist at the Department of Bioengineering
at the University of Washington, Seattle WA U.S.A.
http://healthynaturalbaby.org
-----------------------------------


Re: [gmx-users] Long range Lennard Jones

2013-08-28 Thread Justin Lemkul



On 8/28/13 9:09 PM, Gianluca Interlandi wrote:

Current CHARMM development relies on a 1.2-nm cutoff for LJ, so that's how we
balance all of the forces during parameterization.


Ok, I agree. What about the use of PME for Coulomb? The CHARMM PARAM22 force
field was parametrized using SHIFT on electrostatic forces making it zero after
12 A (please note that SHIFT and FSHIFT in CHARMM greatly differ from the
gromacs-style Shift). Also, the CHARMM switch function is different from gromacs
Switch (at least from what I can tell reading pages 71-72 of the gromacs manual
4.6.3).



PME is generally vastly better than switched or shifted electrostatics.  This 
has been demonstrated in the literature consistently over many years.


-Justin


Thanks,

  Gianluca


ad hoc changes like these are not worth the tiny (potential) increase in
performance. As I recently told someone else on this topic, if you're intent
on fiddling with the typical workings of a force field, especially if you're
making changes to something so fundamental, be prepared to undertake a
demonstration that you can recapitulate all of the expected outcomes of the
force field or improve upon them.  My gut feeling, in this case and others, is
that you won't be able to. You're messing with something that is fairly
critical to obtaining sensible results.

As for dispersion correction, it is generally helpful, but it assumes a
homogeneous environment.  If you simulate with a membrane, for instance, this
assumption breaks down, though some literature suggests that use of dispersion
correction in these cases is still better than nothing.

-Justin

--
==

Justin A. Lemkul, Ph.D.
Postdoctoral Fellow

Department of Pharmaceutical Sciences
School of Pharmacy
Health Sciences Facility II, Room 601
University of Maryland, Baltimore
20 Penn St.
Baltimore, MD 21201

jalem...@outerbanks.umaryland.edu | (410) 706-7441

==



-----------------------------------
Gianluca Interlandi, PhD gianl...@u.washington.edu
 +1 (206) 685 4435
 http://artemide.bioeng.washington.edu/

Research Scientist at the Department of Bioengineering
at the University of Washington, Seattle WA U.S.A.
 http://healthynaturalbaby.org
-----------------------------------


--
==

Justin A. Lemkul, Ph.D.
Postdoctoral Fellow

Department of Pharmaceutical Sciences
School of Pharmacy
Health Sciences Facility II, Room 601
University of Maryland, Baltimore
20 Penn St.
Baltimore, MD 21201

jalem...@outerbanks.umaryland.edu | (410) 706-7441

==


Re: [gmx-users] Long range Lennard Jones

2013-08-28 Thread Gianluca Interlandi

Justin,

I respect your opinion on this. However, in the paper indicated below by 
BR Brooks they used a cutoff of 10 A on LJ when testing IPS in CHARMM:


Title: Pressure-based long-range correction for Lennard-Jones interactions 
in molecular dynamics simulations: Application to alkanes and interfaces

Author(s): Lague, P; Pastor, RW; Brooks, BR
Source: JOURNAL OF PHYSICAL CHEMISTRY B Volume: 108 Issue: 1 Pages: 
363-368 Published: JAN 8 2004


There is also a paper by Piana and Shaw where different cutoffs for 
non-bonded are tested with CHARMM22 on Anton:


http://www.plosone.org/article/info%3Adoi%2F10.1371%2Fjournal.pone.0039918

They found some subtle differences, in particular for cutoffs shorter than 
9 A. However, Anton uses abrupt truncation (no switching) and I believe 
that the differences they found at cutoffs > 9 A would be much smaller if 
they had used a finer mesh (as they show at the 8 A cutoff). I always use 
fourierspacing=1.0
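
For reference, fourierspacing in a GROMACS .mdp is given in nm (the default
is 0.12), so a 1.0 A mesh corresponds to the sketch below:

coulombtype    = PME
fourierspacing = 0.10   ; nm, i.e. 1.0 A; finer than the 0.12 nm default
pme-order      = 4      ; interpolation order (the default)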


I agree though that it strongly depends on the system and I have always 
run control simulations but never found significant differences in the 
case of just proteins.


Finally, I have not tested it in gromacs, but in NAMD there is a 
performance gain of 25% when using the shorter cutoff. This is a factor to 
consider. When I asked for Teragrid supercomputing allocations back in 
2006 and 2007 and suggested a 10/12/14 cutoff, the reviewers always
complained and cut my requested time by 20% with the justification that I
must use a shorter cutoff.


Gianluca

On Wed, 28 Aug 2013, Justin Lemkul wrote:


On 8/28/13 7:28 PM, Gianluca Interlandi wrote:

Thanks for your replies, Mark. What do you think about the current DispCorr
option in gromacs? Is it worth trying? Also, I wonder whether using
DispCorr for LJ plus PME for Coulomb justifies reducing the cutoff for
non-bonded interactions to 1 nm with the CHARMM force field, where 1.2 nm is
usually recommended.



This is risky.  Current CHARMM development relies on a 1.2-nm cutoff for LJ, 
so that's how we balance all of the forces during parameterization.  To me, 
ad hoc changes like these are not worth the tiny (potential) increase in 
performance. As I recently told someone else on this topic, if you're intent 
on fiddling with the typical workings of a force field, especially if you're 
making changes to something so fundamental, be prepared to undertake a 
demonstration that you can recapitulate all of the expected outcomes of the 
force field or improve upon them.  My gut feeling, in this case and others, 
is that you won't be able to. You're messing with something that is fairly 
critical to obtaining sensible results.


As for dispersion correction, it is generally helpful, but it assumes a 
homogeneous environment.  If you simulate with a membrane, for instance, this 
assumption breaks down, though some literature suggests that use of 
dispersion correction in these cases is still better than nothing.


-Justin

--
==

Justin A. Lemkul, Ph.D.
Postdoctoral Fellow

Department of Pharmaceutical Sciences
School of Pharmacy
Health Sciences Facility II, Room 601
University of Maryland, Baltimore
20 Penn St.
Baltimore, MD 21201

jalem...@outerbanks.umaryland.edu | (410) 706-7441

==



-----------------------------------
Gianluca Interlandi, PhD gianl...@u.washington.edu
+1 (206) 685 4435
http://artemide.bioeng.washington.edu/

Research Scientist at the Department of Bioengineering
at the University of Washington, Seattle WA U.S.A.
http://healthynaturalbaby.org
-----------------------------------


[gmx-users] ERROR : GROMACS finished with error 74

2013-08-28 Thread sri2201
Dear Gromacs users,

I am running an MD simulation of a protein complex (44 kDa) with the
amber99sb-ildn force field. I am getting the error below; it looks like it is
a syntax error.
Input file: gmx-495644.pdb
Base name: gmx-495644
Source directory: /scratch/home/enmr028/home_cream_840250368/CREAM840250368
GROMACS mdrun will run for a maximum of hours. (voms proxy lifetime 0 sec.)
Starting MD protocol for /scratch/home/enmr028/home_cream_840250368/CREAM840250368/gmx-495644.pdb
Using amber force field amber99sb-ildn with tip3p water model
Using PME for treatment of long range coulomb interactions
Not using virtual sites
Simulations will be performed in a rhombic dodecahedron unit cell 750
#---= THIS IS WHERE WE START =--
#---STEP 1A: GENERATE STRUCTURE AND TOPOLOGY FOR INPUT PDB FILE
Wed Aug 28 17:54:19 BST 2013: /mt/experiment-software//enmr/BCBR/gromacs/4.5.3-rtc/bin/pdb2gmx -v -f /scratch/home/enmr028/home_cream_840250368/CREAM840250368/gmx-495644.pdb -o gmx-495644.gro -p gmx-495644.top -ignh -ff amber99sb-ildn -water tip3p -i gmx-495644-posre.itp -posrefc 999
Removing duplicate moleculetype definition in gmx-495644_Protein_chain_D.itp
#---STEP 1B: ADD LIGANDS SPECIFIED ON COMMAND LINE
#---STEP 2: SET PERIODIC BOUNDARY CONDITIONS
Wed Aug 28 17:54:21 BST 2013: /mt/experiment-software//enmr/BCBR/gromacs/4.5.3-rtc/bin/editconf -f gmx-495644.gro -o gmx-495644-pbc.gro -bt dodecahedron -d 1.125 -c
#---STEP 3: RUN EM IN VACUUM
Wed Aug 28 17:54:21 BST 2013: /mt/experiment-software//enmr/BCBR/gromacs/4.5.3-rtc/bin/grompp -f em-vac.mdp -po em-vac-out.mdp -c gmx-495644-pbc.gro -p gmx-495644.top -o gmx-495644-EMv.tpr -maxwarn 1
Wed Aug 28 17:54:21 BST 2013: /mt/experiment-software//enmr/BCBR/gromacs/4.5.3-rtc/bin/mdrun -nice 0 -deffnm gmx-495644-EMv -c gmx-495644-EMv.gro -cpi gmx-495644-EMv.cpt -nt 1 -maxh 12000
Wed Aug 28 17:54:29 BST 2013: FINISHED MDRUN EMVACUUM
#---STEP 4: SOLVATION AND ADDING IONS
Wed Aug 28 17:54:29 BST 2013: /mt/experiment-software//enmr/BCBR/gromacs/4.5.3-rtc/bin/genbox -cp gmx-495644-EMv.gro -cs -o gmx-495644-sol-b4ions.gro
Solvent added: 25839 molecules
Wed Aug 28 17:54:30 BST 2013: /mt/experiment-software//enmr/BCBR/gromacs/4.5.3-rtc/bin/grompp -v -f empty.mdp -c gmx-495644-sol-b4ions.gro -p gmx-495644-sol-b4ions.top -o gmx-495644-sol-b4ions.tpr -po defaults.mdp -maxwarn 1
Net charge of system: 6
Replacing 142 solvent molecules with 68 NA (1) and 74 CL (-1) ions.
Wed Aug 28 17:54:33 BST 2013: /mt/experiment-software//enmr/BCBR/gromacs/4.5.3-rtc/bin/genion -s gmx-495644-sol-b4ions.tpr -o gmx-495644-sol.gro -g genion.log -n sol.ndx -pq 1 -nn 74 -np 68 -pname NA -nname CL -nq -1 -rmin 0.5 -random
#---STEP 5: ENERGY MINIMIZATION IN SOLVENT (NVT)
Wed Aug 28 17:54:38 BST 2013: /mt/experiment-software//enmr/BCBR/gromacs/4.5.3-rtc/bin/grompp -f em-sol.mdp -po em-sol-out.mdp -c gmx-495644-sol.gro -p gmx-495644-sol.top -o gmx-495644-EMs.tpr -maxwarn 1
Wed Aug 28 17:54:39 BST 2013: /mt/experiment-software//enmr/BCBR/gromacs/4.5.3-rtc/bin/mdrun -nice 0 -deffnm gmx-495644-EMs -c gmx-495644-EMs.gro -cpi gmx-495644-EMs.cpt -nt 1 -maxh 12000
Wed Aug 28 17:56:49 BST 2013: FINISHED MDRUN EMSOLVENT
#---STEP 6: POSITION RESTRAINT MD, NVT -- CYCLE THROUGH PRFC AND TEMP/TAU_T
Solute Solvent Equilibration (NVT/PR): Temperatures: 300.0 Coupling times: 0.1 Position restraint Fcs: 200 200 200
NVT Equilibration at 300.0 Kelvin (tau_t=0.1) with Position restraint force 200
Wed Aug 28 17:56:54 BST 2013: /mt/experiment-software//enmr/BCBR/gromacs/4.5.3-rtc/bin/grompp -f pr-200-nvt-300.0-0.1.mdp -po pr-200-nvt-300.0-0.1-out.mdp -c gmx-495644-EMs.gro -p gmx-495644-sol.top -n gmx-495644-sol.ndx -o gmx-495644-PR-200-NVT-300.0-0.1.tpr -maxwarn 1 -r gmx-495644-EMs.gro
Wed Aug 28 17:56:55 BST 2013: /mt/experiment-software//enmr/BCBR/gromacs/4.5.3-rtc/bin/mdrun -nice 0 -deffnm gmx-495644-PR-200-NVT-300.0-0.1 -c gmx-495644-PR-200-NVT-300.0-0.1.gro -cpi gmx-495644-PR-200-NVT-300.0-0.1.cpt -nt 1 -maxh 12000
Thu Aug 29 03:04:27 BST 2013: FINISHED MDRUN NVT-PR 200
NVT Equilibration at 300.0 Kelvin (tau_t=0.1) with Position restraint force 200
Output found (gmx-495644-PR-200-NVT-300.0-0.1.gro). Skipping step NVT-PR
#---STEP 7: UNRESTRAINED MD 20 ps NPT -- CYCLE THROUGH PRESSURE/TAU_P
Equilibration (NpT): Pressures: 1.01325 Coupling times: 0.5
NpT Equilibration at 1.01325 bar (tau_p=0.5)
Thu Aug 29 03:05:00 BST 2013: /mt/experiment-software//enmr/BCBR/gromacs/4.5.3-rtc/bin/grompp -f npt-1.01325-0.5.mdp -po npt-1.01325-0.5-out.mdp -c gmx-495644-PR-200-NVT-300.0-0.1.gro -p gmx-495644-sol.top -n gmx-495644-sol.ndx -o gmx-495644-NPT-1.01325-0.5.tpr -maxwarn 1 -r gmx-495644-EMs.gro
Thu Aug 29 03:05:01 BST 2013: /mt/experiment-software//enmr/BCBR/gromacs/4.5.3-rtc/bin/mdrun -nice 0 -deffnm gmx-495644-NPT-1.01325-0.5 -c gmx-495644-NPT-1.01325-0.5.gro -cpi gmx-495644-NPT-1.01325-0.5.cpt -nt 1 -maxh 12000
Thu Aug 29 06:04:39 2013

MDS ERROR : GROMACS finished with error 74,

Please suggest how I can avoid this bug, or point out my input errors.

Thanking you in advance.

[gmx-users] MD vs. free energy simulations

2013-08-28 Thread Jernej Zidar
Hi,
  I ran some MD simulations (NPT ensemble) and a series of simulations
to determine the free energy of water solvation of a not to big
molecule.

  I noticed that I was able to run the MD simulations using all
the CPUs (or threads) in my workstation (12 CPUs or 24 threads,
respectively), whereas for the free energy runs I can use 2 CPUs at most.
If I try to use more, the simulations crash, stating:
Program mdrun, VERSION 4.6.3
Source code file: /home/zidar/utils/gromacs-4.6.3/src/mdlib/domdec.c, line: 6792

Fatal error:
There is no domain decomposition for 4 nodes that is compatible with
the given box and a minimum cell size of 4.52667 nm
Change the number of nodes or mdrun option -rdd or -dds
Look in the log file for details on the domain decomposition
For more information and tips for troubleshooting, please check the GROMACS
website at http://www.gromacs.org/Documentation/Errors

- - - -

  I can use more CPUs only if I switch from the domain decomposition to
the particle decomposition scheme. The size of the systems evaluated is
11.92513 nm x 5.44212 nm x 5.35234 nm with ~30,000 atoms, so I assume
the size of the system is not an issue.

  Big question: Why is that so? Why can I use more CPUs for 'regular'
MD but only two for free energy simulations?

  I'd really like to use more CPUs, as it would really speed up the simulations.

Thanks in advance,
Jernej Zidar
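
Two workarounds often suggested for this; sketches only, using 4.6.x mdrun
options ("fe_run" is a placeholder name, and note -pd was dropped after the
4.6 series):

# few domains, remaining cores filled with OpenMP threads (thread-MPI build)
mdrun -ntmpi 2 -ntomp 12 -deffnm fe_run

# or fall back to particle decomposition, which has no minimum cell size
mdrun -pd -deffnm fe_run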


Re: [gmx-users] problem of submitting job in HPC

2013-08-28 Thread Albert
hello Mark:

 thanks a lot for kind advices. Here is my log file for output of mdrun
-version, There are always some duplicate informations and files



Program: mdrun_mpi
Program: mdrun_mpi
Program: mdrun_mpi
[the "Program: mdrun_mpi" line repeats once for every rank started by mpiexec]

Gromacs version:    VERSION 4.6.3
Precision:          single
Memory model:       64 bit
MPI library:        MPI
OpenMP support:     enabled
GPU support:        disabled
invsqrt routine:    (1.0/__sqrt(x))
CPU acceleration:   NONE
FFT library:        fftw-3.3.2-fma
Large file support: enabled
RDTSCP usage:       disabled
Built on:           Thu Aug 29 02:03:12 CEST 2013
Built by:           sheed@c2n25-hf0 [CMAKE]
Build OS/arch:      AIX 1 00CCE0564C00
Build CPU vendor:   Unknown
Build CPU brand:    Unknown CPU brand
Build CPU family:   0   Model: 0   Stepping: 0
Build CPU features: CannotDetect
C compiler:         /usr/bin/mpcc_r XL mpcc_r
C compiler flags:   -qlanglvl=extc99 -qarch=auto -qtune=auto -qthreaded -qalias=noansi -qhalt=e -O3 -qstrict -qsimd=auto -qmaxmem=-1 -qarch=pwr7 -qtune=pwr7

[the same version block is printed again by each rank]
.
.
.





2013/8/28 Mark Abraham 

> On Wed, Aug 28, 2013 at 7:06 PM, Justin Lemkul  wrote:
> >
> >
> > On 8/28/13 12:39 PM, Albert wrote:
> >>
> >> Hello:
> >>
> >>   I am trying to use the following command to run 4.6.3 on an HPC cluster:
> >>
> >> mpiexec -n 32 /opt/gromacs/4.6.3/bin/mdrun_mpi  -dlb yes -v -s md.tpr -x
> >> md.xtc
> >> -o md.trr -g md.log -e md.edr  >& md.info
> >>
> >> the 4.5.5 works fine in this machine with command:
> >>
> >> mpiexec -n 32 mdrun -nosum -dlb yes -v -s md.tpr -x md.xtc -o md.trr -g
> >> md.log
> >> -e md.edr  >& md.info
> >>
> >> the difference is that the option "-nosum" is not available in 4.6.3
> >>
> >> but 4.6.3 always fails. It generates a lot of duplicate files and duplicated
> >> log information. It looks like mpiexec is invoking the serial mdrun.
> >>
> >> Does anybody have any idea?
> >>
> >
> > Have you verified that the 4.6.3 mdrun was correctly installed such that
> it
> > can make use of MPI?
>
> i.