Re: mdrun on 8-core AMD + GTX TITAN (was: Re: [gmx-users] Re: Gromacs-4.6 on two Titans GPUs)

2013-11-12 Thread Dwey Kauffman
…purpose when the cutoff is > 1.6 nm, but the total performance (ns/day) decreases severely. That's NOT what I want, because I would like to assign 0.8, 1.0, or 1.2 nm to the cutoffs for general purposes. In this case I am testing the pull code from Justin's umbrella sampling tutorial. I should a…

Re: mdrun on 8-core AMD + GTX TITAN (was: Re: [gmx-users] Re: Gromacs-4.6 on two Titans GPUs)

2013-11-12 Thread Szilárd Páll
> Computing:          Nodes   Count   Wall t (s)     G-Cycles      %
> …                      24    1002     5398.693   173338.936   24.1
> PME 3D-FFT             24    1002     2798.482    89852.482   12.5
> PME 3D-FFT Comm.       24    1002      947.033    30406.937    4.2
> PME solve              24     501      420.667    13506.611    1.9
> -…

Re: mdrun on 8-core AMD + GTX TITAN (was: Re: [gmx-users] Re: Gromacs-4.6 on two Titans GPUs)

2013-11-10 Thread Mark Abraham
…To hope to see some scaling, you'd need to be able to drop the PME mesh time by about a factor of two (coarser grid, and a compensating increase to rcoulomb), and hope there was enough PP work that using two GPUs for a single simulation is even worth considering. Achieving throughput-style scal…

Re: mdrun on 8-core AMD + GTX TITAN (was: Re: [gmx-users] Re: Gromacs-4.6 on two Titans GPUs)

2013-11-09 Thread Dwey Kauffman
Time: 178961.450 core s, 22398.880 wall s, 799.0 % (6h13:18). Performance: 38.573 ns/day, 0.622 hour/ns.

Re: [gmx-users] Re: Gromacs-4.6 on two Titans GPUs

2013-11-07 Thread Szilárd Páll
On Thu, Nov 7, 2013 at 6:34 AM, James Starlight wrote: > I've come to the conclusion that simulations with 1 or 2 GPUs give me the same performance: > mdrun -ntmpi 2 -ntomp 6 -gpu_id 01 -v -deffnm md_CaM_test > mdrun -ntmpi 2 -ntomp 6 -gpu_id 0 -v -deffnm md_CaM_test > Could it b…

mdrun on 8-core AMD + GTX TITAN (was: Re: [gmx-users] Re: Gromacs-4.6 on two Titans GPUs)

2013-11-07 Thread Szilárd Páll
Let's not hijack James' thread, as your hardware is different from his. On Tue, Nov 5, 2013 at 11:00 PM, Dwey Kauffman wrote: > Hi Szilard, > > Thanks for your suggestions. I am indeed aware of this page. On an 8-core > AMD with 1 GPU, I am very happy with its performance. See below. My actual…

Re: [gmx-users] Re: Gromacs-4.6 on two Titans GPUs

2013-11-07 Thread Mark Abraham
First, there is no value in ascribing problems to the hardware if the simulation setup is not yet balanced, or not large enough to provide enough atoms and long enough rlist to saturate the GPUs, etc. Look at the log files and see what complaints mdrun makes about things like PME load balance, and

Re: [gmx-users] Re: Gromacs-4.6 on two Titans GPUs

2013-11-06 Thread James Starlight
I've come to the conclusion that simulations with 1 or 2 GPUs give me the same performance: mdrun -ntmpi 2 -ntomp 6 -gpu_id 01 -v -deffnm md_CaM_test, mdrun -ntmpi 2 -ntomp 6 -gpu_id 0 -v -deffnm md_CaM_test. Could it be due to too few CPU cores or additional RAM (this system has 32 GB…
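When one simulation cannot use both cards efficiently, a common alternative is to run two independent simulations, one per GPU; a minimal sketch assuming the same 12-core node as in the commands above (run names and -pinoffset values are illustrative):

    mdrun -ntmpi 1 -ntomp 6 -gpu_id 0 -pin on -pinoffset 0 -deffnm run0 &
    mdrun -ntmpi 1 -ntomp 6 -gpu_id 1 -pin on -pinoffset 6 -deffnm run1 &

This is the "throughput-style" scaling Mark mentions in his reply: aggregate ns/day across runs rather than scaling of a single trajectory.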

Re: [gmx-users] Re: Gromacs-4.6 on two Titans GPUs

2013-11-06 Thread Richard Broadbent
Hi Dwey, On 05/11/13 22:00, Dwey Kauffman wrote: Hi Szilard, Thanks for your suggestions. I am indeed aware of this page. On an 8-core AMD with 1 GPU, I am very happy with its performance. See below. My intention is to obtain an even better one because we have multiple nodes. ### 8-core AMD…

[gmx-users] Re: Gromacs-4.6 on two Titans GPUs

2013-11-05 Thread Dwey Kauffman
Hi Szilard, Thanks for your suggestions. I am indeed aware of this page. On an 8-core AMD with 1 GPU, I am very happy with its performance. See below. My intention is to obtain an even better one because we have multiple nodes. ### 8-core AMD with 1 GPU, force evaluation time GPU/CPU: 4.006 ms…

Re: [gmx-users] Re: Gromacs-4.6 on two Titans GPUs

2013-11-05 Thread Szilárd Páll
Hi Dwey, First and foremost, make sure to read the http://www.gromacs.org/Documentation/Acceleration_and_parallelization page, in particular the "Multiple MPI ranks per GPU" section which applies in your case. Secondly, please do post log files (pastebin is your friend), the performance table at
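The "Multiple MPI ranks per GPU" setup referred to here maps several thread-MPI ranks onto each card by repeating device ids; a minimal sketch, assuming GROMACS 4.6 on a 2-GPU, 12-core node (rank and thread counts are illustrative):

    mdrun -ntmpi 4 -ntomp 3 -gpu_id 0011 -deffnm md

Each character in the -gpu_id string assigns one PP rank to a GPU, so 0011 puts two ranks on GPU 0 and two on GPU 1.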

[gmx-users] Re: Gromacs-4.6 on two Titans GPUs

2013-11-05 Thread Dwey
Hi Mike, I have a similar configuration, except a cluster of AMD-based Linux platforms with 2 GPU cards. Your suggestion works. However, the performance with 2 GPUs discourages me because, for example, with 1 GPU our compute node can easily obtain a simulation of 31 ns/day for a protein of…

Re: [gmx-users] Re: Gromacs: GPU detection

2013-09-13 Thread Szilárd Páll
FYI, I've filed a bug report which you can track if interested: http://redmine.gromacs.org/issues/1334 -- Szilárd On Sun, Sep 1, 2013 at 9:49 PM, Szilárd Páll wrote: > I may have just come across this issue as well. I have no time to > investigate, but my guess is that it's related to some thread…

Re: [gmx-users] Re: Gromacs: GPU detection

2013-09-01 Thread Szilárd Páll
I may have just come across this issue as well. I have no time to investigate, but my guess is that it's related to some thread-safety issue with thread-MPI. Could one of you please file a bug report on redmine.gromacs.org? Cheers, -- Szilárd On Thu, Aug 8, 2013 at 5:52 PM, Brad Van Oosten wro

Re: [gmx-users] Re: GROMACS-CYSTEINE PROTEASES

2013-08-25 Thread MUSYOKA THOMMAS
Thanks so much Justin, I really appreciate your guidance and your input in this forum. Thanks. On Sun, Aug 25, 2013 at 6:00 PM, Justin Lemkul wrote: > > > On 8/25/13 11:35 AM, MUSYOKA THOMMAS wrote: > >> Hello, >> I am trying to minimize my ligand-protein structures but when I run the >> code,

Re: [gmx-users] Re: GROMACS-CYSTEINE PROTEASES

2013-08-25 Thread Justin Lemkul
On 8/25/13 11:35 AM, MUSYOKA THOMMAS wrote: Hello, I am trying to minimize my ligand-protein structures but when I run the code, mdrun -v deffnm em.tpr i am getting the error: Program mdrun, VERSION 4.5.5 Source code file: /build/buildd/gromacs-4.5.5/src/gmxlib/gmxfio.c, line: 519 Can not o

Re: [gmx-users] Re: GROMACS-CYSTEINE PROTEASES

2013-08-25 Thread MUSYOKA THOMMAS
Hello, I am trying to minimize my ligand-protein structures, but when I run the command mdrun -v deffnm em.tpr I am getting the error: Program mdrun, VERSION 4.5.5 Source code file: /build/buildd/gromacs-4.5.5/src/gmxlib/gmxfio.c, line: 519 Can not open file: topol.tpr For more information and tips…
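The error itself suggests the likely fix: without the leading dash, "deffnm em.tpr" is not parsed as an option, so mdrun falls back to the default input name topol.tpr. The corrected invocation (note the dash, and that -deffnm takes the file-name prefix without extension) would presumably be:

    mdrun -v -deffnm em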

Re: [gmx-users] Re: GROMACS-CYSTEINE PROTEASES

2013-08-22 Thread Justin Lemkul
On 8/22/13 3:25 PM, MUSYOKA THOMMAS wrote: Dear users, I am dealing with molecular docking of ligands to cysteine proteases and I have several questions to pose. 1) For example, looking at the structure of Falcipain-2 (PDB ID 2OUL), I can see it has several water molecules. When and how do I deter…

[gmx-users] Re: GROMACS-CYSTEINE PROTEASES

2013-08-22 Thread MUSYOKA THOMMAS
Dear users, I am dealing with molecular docking of ligands to cysteine proteases and I have several questions to pose. 1) For example, looking at the structure of Falcipain-2 (PDB ID 2OUL), I can see it has several water molecules. When and how do I determine whether I should strip off the molecules?

[gmx-users] Re: Gromacs: GPU detection

2013-08-08 Thread Brad Van Oosten
I can confirm this, I have noticed this as well.

[gmx-users] Re: Gromacs 4.6.3 installation Issue with Intel & CUDA

2013-08-06 Thread Brad Van Oosten
I have just successfully installed 4.6.3 using Intel compilers this morning, with the following: Intel 12.1.3, CUDA 5.0.35, OpenMPI 1.6.2, CMake 2.8.10.2, FFTW 3.3.3. I did not use gcc at all (intel/12.1.3/icc/bin/icc for CUDA_HOST_COMPILER). Hope it helps, Brad
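A minimal sketch of how such a build might be configured; the source, FFTW, and install paths are hypothetical, while the CUDA_HOST_COMPILER value is the one Brad quotes:

    CC=icc CXX=icpc cmake /path/to/gromacs-4.6.3 \
        -DGMX_GPU=ON \
        -DCUDA_HOST_COMPILER=/opt/intel/12.1.3/icc/bin/icc \
        -DCMAKE_PREFIX_PATH=/path/to/fftw-3.3.3 \
        -DCMAKE_INSTALL_PREFIX=/path/to/install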

Re: [gmx-users] Re: Gromacs 4.5.5

2013-06-17 Thread Emmanuel, Alaina
Hello Maggin, I re-installed both cmake and fftw. Now I no longer have any output from "grep -i error make.log". However, I still can't get past the same stage in "Making all in man 7".

Re: [gmx-users] Re: Gromacs 4.5.5

2013-06-17 Thread Emmanuel, Alaina
From: maggin, 17/06/2013 04:50 (GMT+00:00): > Hi, Alaina. Before you install GMX 4.5.5, did you install cmake-2.8.8 and fftw-3.3.2? maggin

Re: [gmx-users] Re: Gromacs 4.5.5

2013-06-16 Thread Mark Abraham
On Jun 17, 2013 5:50 AM, "maggin" wrote: > > Hi, Alaina > > Before you install GMX4.5.5, did you install cmake-2.8.8 and fftw-3.3.2 ? Installing cmake for an autotools build isn't going to help ;-) Mark

[gmx-users] Re: Gromacs 4.5.5

2013-06-16 Thread maggin
Hi, Alaina Before you install GMX4.5.5, did you install cmake-2.8.8 and fftw-3.3.2 ? maggin

Re: [gmx-users] Re: gromacs 4.6.2 MPI distribution location problems

2013-06-12 Thread Mark Abraham
On Wed, Jun 12, 2013 at 12:42 AM, sirishkaushik wrote: > Note that this works fine with 4.5.5. > > when I installed using: > > ./configure CPPFLAGS="-I/home/kaushik/fftw-new/include" > LDFLAGS="-L/home/kaushik/fftw-new/lib" --enable-mpi --prefix > /home/kaushik/gromacs_executable/gromacs-old

[gmx-users] Re: gromacs 4.6.2 MPI distribution location problems

2013-06-11 Thread sirishkaushik
Note that this works fine with 4.5.5. when I installed using: ./configure CPPFLAGS="-I/home/kaushik/fftw-new/include" LDFLAGS="-L/home/kaushik/fftw-new/lib" --enable-mpi --prefix /home/kaushik/gromacs_executable/gromacs-old ldd mdrun on the 4.5.5 version correctly points to the fftw and the mpi
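For 4.6.2 the autotools build is gone, so the equivalent configuration goes through CMake; a minimal sketch reusing the paths from the 4.5.5 configure line above (the source path and install directory name are hypothetical):

    cmake /path/to/gromacs-4.6.2 \
        -DGMX_MPI=ON \
        -DCMAKE_PREFIX_PATH=/home/kaushik/fftw-new \
        -DCMAKE_INSTALL_PREFIX=/home/kaushik/gromacs_executable/gromacs-new

ldd on the installed mdrun should then point at the intended FFTW and MPI libraries, as it does for 4.5.5.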

[gmx-users] Re: GROMACS 4.6.2 released

2013-05-30 Thread Mark Abraham
Sorry, the link to the release notes should be http://www.gromacs.org/About_Gromacs/Release_Notes/Versions_4.6.x#Release_notes_for_4.6.2 (Darn, I was hoping for a clean sheet, but at least this time we got to four hours before someone noticed something was wrong!) Mark On Thu, May 30, 2013 at 5

[gmx-users] Re: GROMACS 4.5.7 released

2013-04-29 Thread Mark Abraham
Hi GROMACS users, The 4.5.7 tarball at the link below has been updated to include the files necessary to build with "configure" in the usual 4.5.x way. (Thanks, Rossen!) The code itself is unchanged - you do not need to rebuild GROMACS 4.5.7 if you have already built using CMake. So that you can v

Re: [gmx-users] Re: gromacs 4.6.1 on win7?

2013-04-03 Thread Mark Abraham
Does other compilation with these compilers work? Mark On Wed, Apr 3, 2013 at 10:12 AM, 라지브간디 wrote: > Dear gmx, > > I have tried both 32- and 64-bit Cygwin on my Win7 64-bit system, but both report the gcc compiler as broken, as follows: > > -- Check for working C compiler: /usr/b…

[gmx-users] Re: gromacs 4.6.1 on win7?

2013-04-03 Thread 라지브간디
Dear gmx, I have tried both 32- and 64-bit Cygwin on my Win7 64-bit system, but both report the gcc compiler as broken, as follows: -- Check for working C compiler: /usr/bin/gcc-4.exe -- broken (32 bit) -- Check for working C compiler: /usr/bin/gcc.exe -- broken (64 bit)

[gmx-users] Re: Gromacs-4.6 installation on cygwin problem

2013-03-14 Thread neshathaq
Mr M. Thanks a lot for your help... I will contact you if I get any problem.

[gmx-users] Re: gromacs VERSION 4.0.7-There is no domain

2013-03-10 Thread Christoph Junghans
> Date: Sun, 10 Mar 2013 03:17:24 -0700 (PDT) > From: Hamid Mosaddeghi > Subject: [gmx-users] Re: gromacs VERSION 4.0.7 - There is no domain decomposition. > To: gmx-users@gromacs.org

[gmx-users] Re: gromacs VERSION 4.0.7-There is no domain decomposition.....

2013-03-10 Thread Hamid Mosaddeghi
Dear Christoph, thanks for the quick reply. > It seems like mdrun was not able to find a decomposition automatically; try to give one by hand: $ mdrun -dd 2 2 4 ... Is this command general, or not? > Btw, your gromacs is 2 major versions behind; it might be a good idea to update ... Yes, but I modified so…
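For context, -dd sets the domain-decomposition grid directly, and the product of the three numbers must equal the number of PP ranks. A minimal sketch for the 16-node run described below, assuming no separate PME ranks (the grid values are illustrative and must suit the box dimensions):

    mpirun -np 16 mdrun_mpi -dd 4 2 2 -deffnm md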

[gmx-users] Re: gromacs VERSION 4.0.7-There is no domain decomposition.....

2013-03-09 Thread Christoph Junghans
Dear Hamid, please post these kinds of questions on the users list, not the developers list. 2013/3/9 Hamid Mosaddeghi: > Dear users > > I used gromacs for my system, which includes CNT-water-ion-protein (400,000 atoms); grompp ran without error. > > After running mdrun with 16 nodes on the cluster, I get this…

Re: [gmx-users] Re: Gromacs 4.6 Installation under Cygwin

2013-02-26 Thread Szilárd Páll
I've a few more comments. First of all, please upload the full mdrun.debug output to redmine! It looks like the detection of the number of cores in your machine does not work with Cygwin, because when you only set "-ntmpi 1", N OpenMP threads should be auto-set (N = #cores), but based on your output it…

Re: [gmx-users] Re: Gromacs 4.6 Installation under Cygwin

2013-02-26 Thread Szilárd Páll
Hi, That is a likely cause. I cannot comment with full certainty on the use of MARTINI with the Verlet scheme and without shifts, but I know that it is a topic that's being investigated, and hopefully others can comment on it. However, the hanging should definitely not happen; instead an error s…

Re: [gmx-users] Re: Gromacs 4.6 Installation under Cygwin

2013-02-26 Thread toma0052
Hello, It looks like the problem might be in my mdp settings. I am using the MARTINI force field which uses shift functions for both electrostatics and vdw which are only available in the group-based cut-off scheme. OpenMP doesn't seem to run with the group-based cut-off scheme, just Verlet

Re: [gmx-users] Re: Gromacs 4.6 Installation under Cygwin

2013-02-26 Thread Mirco Wahab
Hi Mike On 25.02.2013 17:25, toma0...@umn.edu wrote: ... Estimated maximum distance required for P-LINCS: 0.810 nm ... Domain decomposition grid 4 x 2 x 1, separate PME nodes 0 Do your runs use PME and/or P-LINCS? Maybe you could pin down the problem by disabling each or both in your .mdp fi…

Re: [gmx-users] Re: Gromacs 4.6 Installation under Cygwin

2013-02-25 Thread toma0052
Hi, I have run the 3 scenarios that you mentioned. The commands and output are pasted below. Thanks, Mike
***Trial 1***
mdrun -v -deffnm Clp_Test -ntmpi 1 -ntomp 1
Reading file Clp_Test.tpr, VERSION 4.6 (single precision)
Using 1 MPI thread
Can not set thread affinities on the current pl…

Re: [gmx-users] Re: Gromacs 4.6 Installation under Cygwin

2013-02-25 Thread Szilárd Páll
That's strange, it seems that mdrun gets stuck somewhere. This should not happen, but as we don't actively test cygwin, we can't be sure what's happening. It would be great if you could help us figure out what is going wrong. Could you try doing the following: - run with -ntmpi 1 -ntomp 1 (i.e sin
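A sketch of the three diagnostic runs being requested, using the command from Mike's trials above; the second and third variants are assumptions based on the truncated text:

    mdrun -v -deffnm Clp_Test -ntmpi 1 -ntomp 1   # single-threaded
    mdrun -v -deffnm Clp_Test -ntmpi 1            # one rank, OpenMP thread count auto-set
    mdrun -v -deffnm Clp_Test -ntmpi 2 -ntomp 1   # two thread-MPI ranks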

[gmx-users] Re: Gromacs 4.6 Installation under Cygwin

2013-02-25 Thread toma0052
Hello, Thanks for the help. After setting the library path properly, I seem to be able to get gromacs up and running. However, I have run into another problem with mdrun and actually running any jobs. When I execute mdrun -v -deffnm Clp_Test -nt, the output is: Reading file Clp_Test.tpr, V…

[gmx-users] Re: gromacs-4.6.tar.gz installation question

2013-02-05 Thread Christoph Junghans
> Date: Mon, 04 Feb 2013 15:32:17 -0500 > From: Justin Lemkul > Subject: Re: [gmx-users] gromacs-4.6.tar.gz installation question > To: Discussion list for GROMACS users > On 2/4/13 3:30 PM…

[gmx-users] Re: gromacs-4.6.tar.gz installation question

2013-02-04 Thread jeela keel
Thank you for your response, but I did not receive the reply to my email; I found the answer online. I don't know why I am not receiving any email back from the mailing list. Thank you Jeela On Mon, Feb 4, 2013 at 12:30 PM, jeela keel wrote: > *Dear All, > > I am trying to install gromacs, I downl…

[gmx-users] Re: gromacs 4.6 GB/SA problem and poor performance

2013-01-27 Thread Changwon Yang
Using ICC 13.0, I got the same result.

[gmx-users] Re: gromacs 4.6 GB/SA problem and poor performance

2013-01-21 Thread Changwon Yang
Input files: conf.gro and the mdp files below. http://www.gromacs.org/Documentation/Installation_Instructions_4.5/GROMACS-OpenMM
cpu-imp-RF-inf.mdp:
constraints = all-bonds
integrator  = md
dt          = 0.002  ; ps
nsteps      = 0
nstlist     = 0
ns_type     = …

Re: [gmx-users] Re: gromacs 4.6 segfault

2013-01-15 Thread Dr. Vitaly Chaban
On Tue, Jan 15, 2013 at 1:09 PM, Justin Lemkul wrote: > > > On 1/15/13 7:06 AM, Dr. Vitaly Chaban wrote: >>> >>> using mdrun (version 4.6-beta3) on a GPU node (1 nvidia K10 with cuda >>> drivers and runtime 4.2 + 2 Intel 6-core E5s with hyper-threading >>> and SSE4.1) I always get after a fe…

Re: [gmx-users] Re: gromacs 4.6 segfault

2013-01-15 Thread Justin Lemkul
On 1/15/13 7:16 AM, Dr. Vitaly Chaban wrote: On Tue, Jan 15, 2013 at 1:09 PM, Justin Lemkul wrote: On 1/15/13 7:06 AM, Dr. Vitaly Chaban wrote: using mdrun (version 4.6-beta3) on a GPU node (1 nvidia K10 with cuda drivers and runtime 4.2 + 2 Intel 6-core E5s with hyper-threading and…

Re: [gmx-users] Re: gromacs 4.6 segfault

2013-01-15 Thread sebastian
On 01/15/2013 01:09 PM, Justin Lemkul wrote: On 1/15/13 7:06 AM, Dr. Vitaly Chaban wrote: using mdrun (version 4.6-beta3) on a GPU node (1 nvidia K10 with cuda drivers and runtime 4.2 + 2 Intel 6-core E5s with hyper-threading and SSE4.1) I always get after a few or a few hundred ns the followi…

Re: [gmx-users] Re: gromacs 4.6 segfault

2013-01-15 Thread Justin Lemkul
On 1/15/13 7:06 AM, Dr. Vitaly Chaban wrote: using mdrun (version 4.6-beta3) on a GPU node (1 nvidia K10 with cuda drivers and runtime 4.2 + 2 Intel 6-core E5s with hyper-threading and SSE4.1) I always get after a few or a few hundred ns the following segfault: line 15: 28957 Segmentation faul…

[gmx-users] RE: Gromacs 2 CHARMM

2012-10-02 Thread lloyd riggs
Dear All, Does anyone have a small script for converting a Gromacs (GROMOS-type) ff to CHARMM format, or an amino acid top file in CHARMM format for such? I have seen some scripts, but they only work with different topology types. Thought I would ask; otherwise I sit here for three days playin…

Re: [gmx-users] Re: Gromacs 54a7 force field

2012-07-18 Thread Thomas Piggot
Hi, Yes, I agree with you regarding the combination of Berger and GROMOS force field and requiring validation. I just wanted to point out the interactions between the protein and lipid are treated in the same way, irrespective of the different GROMOS protein force field used (when using the p

Re: [gmx-users] Re: Gromacs 54a7 force field

2012-07-18 Thread Justin Lemkul
On 7/18/12 6:57 PM, Thomas Piggot wrote: Hi, Justin, I am interested by your comments regarding the CHARMM lipids. In particular can you elaborate as to why you think that the CHARMM lipids are better than the united-atom ones (such as Berger and several GROMOS variants). I think there's no

Re: [gmx-users] Re: Gromacs 54a7 force field

2012-07-18 Thread Thomas Piggot
Hi, Justin, I am interested by your comments regarding the CHARMM lipids. In particular can you elaborate as to why you think that the CHARMM lipids are better than the united-atom ones (such as Berger and several GROMOS variants). As for the original question, the modifications in going fro

[gmx-users] Re: Gromacs 54a7 force field

2012-07-18 Thread Rajat Desikan
I got the answer to whether we can implement CHARMM36 into gromacs...:) thanks http://www.gromacs.org/Downloads/User_contributions/Force_fields I still want your opinion on whether it is the best ff for simulating a membrane-protein system, and if any modifications to the ff are necessary? Thanks
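Mechanically, integrating a contributed force field usually amounts to unpacking it where pdb2gmx can see it; a minimal sketch, assuming the archive from the user-contributions page above is named charmm36.ff.tgz (the actual file name may differ):

    cd /path/to/your/working/directory
    tar xzf charmm36.ff.tgz    # creates a charmm36.ff/ subdirectory

pdb2gmx then lists the force field alongside the built-in ones, since *.ff directories in the working directory are searched first.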

[gmx-users] Re: Gromacs 54a7 force field

2012-07-18 Thread Rajat Desikan
So CHARMM36 would be the best ff for a long membrane protein simulation? Is it possible to integrate CHARMM36 into Gromacs?

Re: [gmx-users] Re: Gromacs 54a7 force field

2012-07-18 Thread Justin Lemkul
On 7/18/12 6:13 PM, Rajat Desikan wrote: Thanks for the quick and detailed replies Justin :) This helped clear some doubts I had. I thought all Charmm ff were compatible in Gromacs? Which Charmm ff were you referring to? CHARMM force fields are largely just sequential additions and refinemen

[gmx-users] Re: Gromacs 54a7 force field

2012-07-18 Thread Rajat Desikan
Thanks for the quick and detailed replies Justin :) This helped clear some doubts I had. I thought all Charmm ff were compatible in Gromacs? Which Charmm ff were you referring to?

Re: [gmx-users] Re: Gromacs 54a7 force field

2012-07-18 Thread Justin Lemkul
On 7/18/12 5:51 PM, Rajat Desikan wrote: "54A7 also introduced changes to the Gromos96 lipid parameters." How will this change my inclusion of the Berger lipid parameters? Anything that I should pay special attention to? Are there other lipid parameters that are more compatible? There are better force…

[gmx-users] Re: Gromacs 54a7 force field

2012-07-18 Thread Rajat Desikan
"54A7 also introduced changes to the Gromos96 lipid parameters" How will this change my inclusion of the berger lipid parameters? Any thing that I should pay special attention to? Are there other lipid parameters more compatible? I heard from a faculty member at our Institute that the 53a6 is a bad

[gmx-users] Re: Gromacs-Orca QMMM LJ coefficients problem

2012-07-17 Thread Minos Matsoukas
Well, the problem was solved in the end, please don't bother answering. Following the advice of one of the Orca developers, Christoph, I changed QMMMscheme to normal instead of ONIOM and disabled periodic boundary conditions. Now the files are extracted normally. For further reference, one sh…

[gmx-users] Re: Gromacs

2012-05-25 Thread Dr. Vitaly V. Chaban
Dear Ahmed - I do not understand how you imagine "FCC geometry" in the liquid state of matter. If you want to just resize your system, use the standard "genbox" utility and then re-equilibrate at the desired temperature and density (if you want to fix the density, of course). Dr. Vitaly V. Chaban
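A minimal sketch of the resize step with genbox, assuming a cubic target box (the file names and the 6 nm edge are illustrative):

    genbox -cp conf.gro -box 6 6 6 -o resized.gro

followed by re-equilibration (NVT or NPT as appropriate) at the desired temperature.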

[gmx-users] Re: Gromacs files to simulate SiO2 film

2012-05-23 Thread jrustad
Alexey I have a quartz (101) surface and a gromacs BKS input file that might get you started at least. Cheers Jim

Re: [gmx-users] Re: Gromacs files to simulate SiO2 film

2012-05-23 Thread Alexey Lyulin
On May 23, 2012 12:12 PM, jrustad wrote in [gmx-users] Re: Gromacs files to simulate SiO2 film: Dear Alexey, The answer to this question partly depends on your application; for example, is it important for the surface of the amorphous SiO2 to be able to undergo reactions like Si-O-Si + H2O = 2SiOH? Regards,

[gmx-users] Re: Gromacs files to simulate SiO2 film

2012-05-23 Thread jrustad
Dear Alexey, The answer to this question partly depends on your application; for example, is it important for the surface of the amorphous SiO2 to be able to undergo reactions like Si-O-Si + H2O = 2SiOH? Regards, Jim Rustad

[gmx-users] Re: Gromacs on "HPC workstations" ?

2012-04-18 Thread Nicola Fantini
Dear Jonathan, as there is a growing number of cloud providers offering HPC resources, cloud solutions are certainly becoming an attractive alternative to purchasing your own equipment. The prime criterion for choosing either option should be your expected distribution of workload ov…

Re: [gmx-users] Re: GROMACS (w. OpenMPI) fails to run with -np larger than 10

2012-04-11 Thread Szilárd Páll
On Wed, Apr 11, 2012 at 6:16 PM, Mark Abraham wrote: > On 12/04/2012 1:42 AM, haadah wrote: >> >> Could you clarify what you mean by "Sounds like an MPI configuration >> problem. >> I'd get a test program running on 18 cores before worrying about anything >> else."? My problem is that I can't get…

Re: [gmx-users] Re: GROMACS (w. OpenMPI) fails to run with -np larger than 10

2012-04-11 Thread Mark Abraham
On 12/04/2012 1:42 AM, haadah wrote: Could you clarify what you mean by "Sounds like an MPI configuration problem. I'd get a test program running on 18 cores before worrying about anything else."? My problem is that I can't get anything to work with -np set to more than 10. Only you can configu…

[gmx-users] Re: GROMACS (w. OpenMPI) fails to run with -np larger than 10

2012-04-11 Thread haadah
Could you clarify what you mean by "Sounds like an MPI configuration problem. I'd get a test program running on 18 cores before worrying about anything else."? My problem is that I can't get anything to work with -np set to more than 10. The "Cannot rename checkpoint file; maybe you are out of qu…

[gmx-users] Re: Gromacs-CPMD: QMMM

2012-04-03 Thread Jacob Jantzi
All, In reply to the following message, I have been receiving the same error:
QM-Box       0.0 40.0    0.0 40.0    0.0 40.0
Bounding-B   0.0 22.56333    0.0 24.77431    0.0 15.70362
Step-No 0   LMAX-OF-MMQ-EXP 0   INTML-UPD-FREQ 1   OU…

Re: [gmx-users] Re: Gromacs-GPU benchmark test killed after exhausting the memory

2012-03-05 Thread Szilárd Páll
Hi Efrat, It indeed looks like a memory leak. Could you please file a bug on redmine.gromacs.org? Cheers, -- Szilárd On Sun, Mar 4, 2012 at 12:21 PM, Efrat Exlrod wrote: > Hi Szilard, > > Thanks for your reply. > I used your script and I think it does look as a memory leak. Please look at >

[gmx-users] Re: Gromacs-GPU benchmark test killed after exhausting the memory

2012-03-04 Thread Efrat Exlrod
Hi Szilard, Thanks for your reply. I used your script and I think it does look like a memory leak. Please look at the attached runchkmem.out. Is it possible this problem exists in version 4.5.5 and was solved in the version 4.6 you are using? When will version 4.6 be released? Thanks, Efrat

Re: [gmx-users] Re: Gromacs analysis tools for Namd output

2012-02-08 Thread Ignacio Fernández Galván
--- On Tue, 7/2/12, Kunze, Micha wrote: > can't help you with the plugin issue, but have you instead tried loading the dcd trajectory into > vmd and saving it as trr? No need to open it with VMD. You can use catdcd to do the conversion: catdcd…
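A minimal sketch of the catdcd conversion being suggested (file names are illustrative; check catdcd -h for the exact option set of your build):

    catdcd -o traj.trr -otype trr traj.dcd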

Re: [gmx-users] Re: Gromacs analysis tools for Namd output

2012-02-07 Thread Francesco Oteri
Hi Paul, it is possible that the dl library (containing the functions for dynamically loading libraries) is missing. To confirm this hypothesis, try: ldd path_g_rmsf. On my PC this is the output:
linux-vdso.so.1 => (0x7fff6000)
libgmxana.so.6 => /apps/gromacs/4.5.5/gnu/lib/libgm…

Re: [gmx-users] Re: Gromacs analysis tools for Namd output

2012-02-07 Thread Kunze, Micha
Hey Paul, can't help you with the plugin issue, but have you instead tried loading the dcd trajectory into vmd and saving it as trr? Cheers, Micha On 7 Feb 2012, at 18:18, "PAUL NEWMAN" wrote: Dear Gromacs users, This is my second email, since the previous…

[gmx-users] Re: Gromacs analysis tools for Namd output

2012-02-07 Thread PAUL NEWMAN
Dear Gromacs users, This is my second email, since I tried the previous advice and it did NOT work. I want to use the Gromacs analysis tools for analyzing Namd output files (*.dcd files). I installed Gromacs 4.5.4 (64 bit) and it works well. In addition I installed VMD 1.9 (64 bit) and set up VMD_P…

Re: [gmx-users] Re: Gromacs on GPU

2012-01-29 Thread Justin A. Lemkul
Benjamin Hall wrote: Justin A. Lemkul wrote: Ours is gcc-4.4.6 on a local machine using Cuda 3.2. On our university's GPU cluster, the installation was done with gcc-4.3.4 and Cuda 3.1. I have not tested Cuda 4.0, though it has recently become available to us so it might be worth a shot

[gmx-users] Re: Gromacs on GPU

2012-01-29 Thread Benjamin Hall
Justin A. Lemkul wrote: > Ours is gcc-4.4.6 on a local machine using Cuda 3.2. On our university's GPU > cluster, the installation was done with gcc-4.3.4 and Cuda 3.1. I have not > tested Cuda 4.0, though it has recently become available to us so it might be > worth a shot to see if this is a

Re: [gmx-users] Re: Gromacs 4.5.4 on multi-node cluster

2011-12-08 Thread Nikos Papadimitriou
> Forwarded Message > From: Nikos Papadimitriou > To: gmx-users@gromacs.org > Subject: [gmx-users] Re: Gromacs 4.5.4 on multi-node cluster > Date: Thu, 8 Dec 2011 11:44:36 +0200

Re: [gmx-users] Re: Gromacs 4.5.4 on multi-node cluster

2011-12-08 Thread Dimitris Dellis
Hi. This is openmpi related. Probably you have the virbr0 interface active with IP 192.168.122.1 on the nodes. Stop and disable the libvirtd (and probably libvirt-guests) service if you don't need it. Alternatively: 1. add --mca btl_tcp_if_exclude lo,virbr0 to the mpirun flags, or 2. add in /home/grom…
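A minimal sketch of option 1 applied to a GROMACS run (the rank count and binary name are illustrative):

    mpirun --mca btl_tcp_if_exclude lo,virbr0 -np 16 mdrun_mpi -deffnm md

This tells Open MPI's TCP transport to ignore the loopback and libvirt bridge interfaces when the ranks connect to each other.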

[gmx-users] Re: Gromacs 4.5.4 on multi-node cluster

2011-12-08 Thread Nikos Papadimitriou
> Forwarded Message > From: Nikos Papadimitriou > To: gmx-users@gromacs.org > Subject: [gmx-users] Gromacs 4.5.4 on multi-node cluster > Date: Wed, 7 Dec 2011 16:26:46 +0200 > Dear All, I had been running Gromacs 4.0.7…

[gmx-users] RE: gromacs installation

2011-10-11 Thread Li, Hualin
Oh, I have done an installation on a supercomputer before. Because you are not the root user, at the "./configure" step you should probably use "./configure --prefix=/home/yourname/..." to change the default installation directory. Just my two cents. Hope it helps. --hualin

[gmx-users] Re: GROMACS @ Facebook

2011-09-21 Thread Dr. Vitaly V. Chaban
Cool... :-) > Hi, > > for your entertainment and as a reach-out to younger scientists GROMACS > is now on Facebook. Please look us up at: > > http://www.facebook.com/pages/GROMACS/257453660934850 > > We're looking forward to your comments. > > Cheers, > -- > David van der Spoel, Ph.D., Professor o

[gmx-users] Re: gromacs doubts

2011-09-07 Thread Justin A. Lemkul
Please keep all Gromacs-related correspondence on the gmx-users list. I am not a private tutor. karthick wrote: Sir, I am karthick, using gromacs for academic purposes. I had run a protein-ligand complex based on your lysozyme tutorial; further, I want to know about the DGbind value. I heard g_lie comm…

[gmx-users] Re: gromacs question topologie

2011-08-29 Thread Justin A. Lemkul
Please keep all Gromacs-related correspondence on the gmx-users list, particularly if the discussion was previously carried out there. I am not a private tutor. Joschua Sterzenbach wrote: Hi, is only the geometry of the molecule in the coordinate file? Yes. Have a look at its contents -

[gmx-users] RE: Gromacs on GPU: GTX or Tesla?

2011-08-07 Thread Efrat Exlrod
> Date: Thu, 4 Aug 2011 21:41:29 +0200 > From: Szilárd Páll > Subject: Re: [gmx-users] RE: Gromacs on GPU: GTX or Tesla? > To: Discussion list for GROMACS users ... Hi, Tesla cards won't give you much benefit when it comes to runnin…

Re: [gmx-users] RE: Gromacs on GPU: GTX or Tesla?

2011-08-04 Thread Szilárd Páll
Hi, Tesla cards won't give you much benefit when it comes to running the current Gromacs. Additionally, I can tell you so much that this won't change in the future either. The only advantage of the C20x0-s is ECC and double precision - which is ATM anyway not supported in Gromacs on GPUs. Gromacs

[gmx-users] RE: Gromacs on GPU: GTX or Tesla?

2011-08-04 Thread Jagdish S. Varma
Hi, In the long term the Tesla is preferable, as the GTX does not support certain high-end functions. Please verify these before you purchase one. Also the Tesla C-2070 is the newer one available, and with 6.0 GB RAM it really gives great performance. The price is the same. Rgds JSV

[gmx-users] Re: Gromacs Query.

2011-06-23 Thread Justin A. Lemkul
Please keep all Gromacs-related correspondence on the gmx-users list. I am not a private help service. I am CC'ing this message to the list and would ask that all further discussion take place there. Your topology specifies dihedrals that do not exist under the desired force field (whateve

Re: [gmx-users] Re: Gromacs error

2011-06-17 Thread Mark Abraham
On 06/17/2011 06:36 PM, bharat gupta wrote: I took all the parameters of Ptyr from amber parameter database and followed what was said in the documentation. The fact that you seem to change the capitalization of "PTYR" every time you type it does not inspire confidence that you've taken due ca

Re: [gmx-users] Re: Gromacs error

2011-06-17 Thread bharat gupta
I took all the parameters of Ptyr from the amber parameter database and followed what was said in the documentation. I shall mail the changes made to the topology. On Fri, Jun 17, 2011 at 5:17 PM, Mark Abraham wrote: > On 06/17/2011 12:18 PM, bharat gupta wrote: > >> thanks.. I fixed the problem by using…

Re: [gmx-users] Re: Gromacs error

2011-06-17 Thread Mark Abraham
On 06/17/2011 12:18 PM, bharat gupta wrote: thanks.. I fixed the problem by using the command execstack -c filename, but I have another issue: I am preparing the structure for simulation, which is a docked complex containing phosphorylated tyrosine. I am using the Amber 99 force field and upd…
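For reference, the execstack fix mentioned here clears the executable-stack flag on the shared library that triggers the permission denial; a minimal sketch, assuming a default installation path (adjust to wherever libgmx.so.6 actually lives):

    execstack -c /usr/local/gromacs/lib/libgmx.so.6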

Fwd: [gmx-users] Re: Gromacs error

2011-06-17 Thread bharat gupta
…' not found in residue topology database" ... but I have made all the changes, so where else could the problem be? -- Forwarded message -- From: bharat gupta Date: Fri, Jun 17, 2011 at 11:18 AM Subject: Re: [gmx-users] Re: Gromacs error To: Discussion list for GROMACS users

Re: [gmx-users] Re: Gromacs error

2011-06-16 Thread bharat gupta
> When the only tool you own is a hammer, every problem begins to resemble a nail. > From: gmx-users-boun...@gromacs.org On Behalf Of bharat gupta > Sent: Friday, 17 June 2011 10:40 AM > To: Discussi…

RE: [gmx-users] Re: Gromacs error

2011-06-16 Thread Dallas Warren
Sent: Friday, 17 June 2011 10:40 AM To: Discussion list for GROMACS users Subject: [gmx-users] Re: Gromacs error Hi, I have installed gromacs 4.5 on fedora core 15, and whenever I try to run any command like pdb2gmx I am getting the following error: error while loading shared libraries: l…

[gmx-users] Re: Gromacs error

2011-06-16 Thread bharat gupta
Hi, I have installed gromacs 4.5 on fedora core 15 and whenever I try to run any command like pdb2gmx ... I am getting the following error : error while loading shared libraries: libgmx.so.6: cannot enable executable stack as shared object requires: Permission denied Pls help?? -- Bharat --

Re: [gmx-users] Re: Gromacs Installation Error on Powerbook G4 Running OS 10.5.8

2011-05-24 Thread Justin A. Lemkul
Matthew Bick wrote: On May 23, 2011, at 4:24 PM, gmx-users-requ...@gromacs.org wrote: Re: Gromacs Installation Error on Powerbook G4 Running OS 10.5.8 Hi Justin. Thanks for your response. See my responses below, embedded in the original m…

[gmx-users] Re: Gromacs Installation Error on Powerbook G4 Running OS 10.5.8

2011-05-24 Thread Matthew Bick
On May 23, 2011, at 4:24 PM, gmx-users-requ...@gromacs.org wrote: Re: Gromacs Installation Error on Powerbook G4 Running OS 10.5.8 Hi Justin. Thanks for your response. See my responses below, embedded in the original message: Matthew Bick wrote: Dear Gromacs community. I am…

Re: [gmx-users] Re: gromacs QM/MM compilation with gaussian (Txema Mercero)

2011-02-17 Thread Txema Mercero
>> …377000 r-xp 00:17 6783121  /opt/intel/Compiler/11.1/073/lib/intel64/libintlc.so.5
>> 2b0e5d377000-2b0e5d476000 ---p 0003b000 00:17 6783121  /opt/intel/Compiler/11.1/073/lib/intel64/libintlc.so.5
>> 2b0e5d476000-2b0e5d479000 rw-p 0003a000 00:17 6783121  /opt/intel/Compiler/11.1/073/lib/intel64/libintlc.so.5
>> 2b0e5d479000-2b0e5d8b9000 rw-p 2b0e5…
