Re: mdrun on 8-core AMD + GTX TITAN (was: Re: [gmx-users] Re: Gromacs-4.6 on two Titans GPUs)

2013-11-12 Thread Dwey Kauffman
Hi Mark and Szilard, Thanks for both of your suggestions. They are very helpful. > Neither run had a PP-PME work distribution suitable for the hardware it was running on (and fixing that for each run requires opposite changes). Adding a GPU and hoping to see scaling requires that there
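The PP-PME work distribution mentioned above is usually tuned with mdrun's -npme option (or g_tune_pme). As a rough illustration only, the toy function below applies the common rule of thumb of dedicating roughly a quarter of the MPI ranks to reciprocal-space PME; the 0.25 fraction and the function name are assumptions for this sketch, not anything from the thread.

```python
# Toy sketch of a PP:PME rank split heuristic.
# ASSUMPTION: ~25% PME load is only a common starting guess;
# mdrun -npme / g_tune_pme are the real tools for this.

def suggest_npme(total_ranks, pme_load_fraction=0.25):
    """Suggest how many MPI ranks to dedicate to PME.

    pme_load_fraction is the estimated share of total work spent
    in reciprocal-space PME.
    """
    npme = round(total_ranks * pme_load_fraction)
    # Keep at least one PME rank and at least one PP rank.
    return max(1, min(npme, total_ranks - 1))

if __name__ == "__main__":
    for ranks in (8, 16, 64):
        print(ranks, "ranks ->", suggest_npme(ranks), "PME ranks")
```

In practice the split chosen this way still needs verifying against the PP/PME load-imbalance report that mdrun prints at the end of the log.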

Re: mdrun on 8-core AMD + GTX TITAN (was: Re: [gmx-users] Re: Gromacs-4.6 on two Titans GPUs)

2013-11-09 Thread Dwey Kauffman
worse. The more nodes are involved in a task, the worse the performance. >> In my experience even reaction-field runs don't scale across nodes with 10G ethernet if you have more than 4-6 ranks per node trying to communicate (let alone with PME). What does it mea

[gmx-users] Re: Hardware for best gromacs performance?

2013-11-05 Thread Dwey Kauffman
Hi Szilard, Thanks. From Timo's benchmark:
1 node: 142 ns/day
2 nodes FDR14: 218 ns/day
4 nodes FDR14: 257 ns/day
8 nodes FDR14: 326 ns/day
It looks like an InfiniBand network is "required" in order to scale up when running a task across nodes. Is that correct?
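The benchmark numbers quoted above can be turned into a parallel-efficiency figure, which makes the scaling question concrete: efficiency = perf_N / (N x perf_1). A minimal sketch using exactly the four data points from the thread:

```python
# Parallel efficiency of the quoted benchmark (nodes -> ns/day).
benchmark = {1: 142.0, 2: 218.0, 4: 257.0, 8: 326.0}

def efficiency(nodes, perf, base_nodes=1):
    """Fraction of ideal linear speedup relative to the 1-node run."""
    return perf / (nodes / base_nodes * benchmark[base_nodes])

for n, p in sorted(benchmark.items()):
    print(f"{n} node(s): {p:6.1f} ns/day, efficiency {efficiency(n, p):.0%}")
```

Even over FDR InfiniBand the efficiency drops below 30% at 8 nodes for this system, so the interconnect is necessary for any scaling here but far from sufficient for good scaling.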

[gmx-users] Re: Gromacs-4.6 on two Titans GPUs

2013-11-05 Thread Dwey Kauffman
e when we run a task across nodes ? in other words, what does mdrun_mpi look like ? Thanks, Dwey -- View this message in context: http://gromacs.5086.x6.nabble.com/Gromacs-4-6-on-two-Titans-GPUs-tp5012186p5012279.html Sent from the GROMACS Users Forum mailing list archive at Nabble.com.

[gmx-users] Re: Hardware for best gromacs performance?

2013-11-05 Thread Dwey Kauffman
Hi Timo, Can you provide a benchmark with "1" Xeon E5-2680 and "1" Nvidia K20X GPGPU on the same test of 29420 atoms? Are these two GPU cards (within the same node) connected by SLI (Scalable Link Interface)? Thanks, Dwey

[gmx-users] Re: Gromacs-4.6 on two Titans GPUs

2013-11-05 Thread Dwey
, will InfiniBand networking help increase the final performance when we execute an MPI task? Or what else? Or should we forget about MPI and use a single GPU instead? Any suggestion is highly appreciated. Thanks. Dwey > Date: Tue, 5 Nov 2013 16:20:39 +0100 > From: Mark Abraham > Subject: Re: [

[gmx-users] RE: average pressure of a system

2013-09-12 Thread Dwey Kauffman
emains unchanged if pressure coupling is removed in production MD. However, can this be justified for a membrane-protein system? The purpose of pressure coupling is, after all, to stabilize the pressure and density. For example, over a 10 ns simulation the average pressure of this system is -5.55 bar, which is less convincing. Energy Avera

[gmx-users] RE: average pressure of a system

2013-09-11 Thread Dwey Kauffman
Justin Lemkul wrote > On 9/11/13 12:12 AM, Dwey Kauffman wrote: >> True, but thermostats allow temperatures to oscillate on the order of a few K, and that doesn't happen on the macroscopic level either. Hence the small

[gmx-users] RE: average pressure of a system

2013-09-10 Thread Dwey Kauffman
of targeted quantities for comparison. > You could try altering tau_p, but I doubt there is any value in doing so. I would give it a try. Thanks for the hint. Dwey www interface or send it to gmx-users-request@. * Can't post? Read http://www.gromacs.org/Support/Mailing_List
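What tau_p controls can be illustrated with a first-order relaxation toy model: a Berendsen-style barostat drives the instantaneous pressure toward the reference with time constant tau_p. This sketch ignores fluctuations and box scaling entirely; the numbers are made up to show the trend, not taken from the thread.

```python
# Toy first-order relaxation: dP/dt = (P_ref - P) / tau_p.
# ASSUMPTION: illustrative only; a real barostat scales the box and
# the pressure fluctuates by hundreds of bar around the target.

def relax(p0, p_ref, tau_p, dt, steps):
    """Forward-Euler integration of the relaxation equation."""
    p = p0
    for _ in range(steps):
        p += (dt / tau_p) * (p_ref - p)
    return p

# Starting 100 bar away from a 1 bar target, 10 ps of 2 fs steps:
for tau in (0.5, 2.0, 5.0):  # tau_p in ps
    final = relax(101.0, 1.0, tau, 0.002, 5000)
    print(f"tau_p = {tau} ps -> P after 10 ps = {final:.3f} bar")
```

The point of the toy model: a larger tau_p only slows how fast the mean pressure approaches the target; it does not change where the average ends up, which is why altering tau_p rarely fixes an off-target average.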

[gmx-users] RE: average pressure of a system

2013-09-10 Thread Dwey Kauffman
1.69086 0.58 162.879 3.35668 (bar) Running longer simulations does not seem to improve the system pressure much. If I need to modify the mdp file, what should it be? Many thanks, Dwey My mdp file for NPT used in the simulation is like define

[gmx-users] average pressure of a system

2013-09-10 Thread Dwey
bar, although average energy, average temperature (323 K), and average density (1022 kg/m^3) are already at the desired values? What should I do to stabilize the average pressure at a desired value (~1 bar)? Thanks for any input. Dwey
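An average pressure a few bar off the reference is usually just statistics: the instantaneous pressure of a system this size fluctuates by O(100) bar, so the run average carries a standard error of std/sqrt(n_independent). A minimal sketch with synthetic Gaussian samples standing in for g_energy output (the 150 bar spread is an assumed, typical magnitude):

```python
import random
import statistics

# ASSUMPTION: synthetic N(1, 150) samples mimic instantaneous
# pressure readings from g_energy for a ~10^4-atom box.
random.seed(42)
target = 1.0    # bar, reference pressure
spread = 150.0  # bar, typical instantaneous fluctuation

samples = [random.gauss(target, spread) for _ in range(5000)]
mean = statistics.fmean(samples)
std = statistics.stdev(samples)
sem = std / len(samples) ** 0.5  # naive s.e.m., ignores correlation

print(f"mean = {mean:.2f} bar, std = {std:.1f} bar, s.e.m. = {sem:.2f} bar")
```

With a standard error of a couple of bar (larger still once time correlation is accounted for), an average of -5.55 bar can be statistically indistinguishable from 1 bar; the error estimate from g_energy is the number to compare against.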

[gmx-users] Re: GPU version of Gromacs

2013-08-19 Thread Dwey Kauffman
one GPU in your box. Hope this helps. Dwey -- View this message in context: http://gromacs.5086.x6.nabble.com/GPU-version-of-Gromacs-tp5010581p5010606.html Sent from the GROMACS Users Forum mailing list archive at Nabble.com.

[gmx-users] GPU / CPU load imblance

2013-06-25 Thread Dwey
CPU. I appreciate kind advice and hints on improving this mdp file. Thanks, Dwey
### courtesy to Justin
title = Umbrella pulling simulation
define = -DPOSRES_B
; Run parameters
integrator = md
dt = 0.002
tinit = 0
nsteps = 5000000 ; 10 ns
nstc

[gmx-users] Re: free energy calculations of methane in water computed by GMX ver 4.5.7 and ver 4.6.2

2013-06-22 Thread Dwey
19 kJ mol-1) remains incorrect. Thanks, Dwey ++ g_bar ver 4.5.7:
md0.05.xvg: 0.0 - 5000.0; lambda = 0.050
foreign lambdas: 0.050 (250001 pts) 0.000 (250001 pts) 0.100 (250001 pts)
md0.15.xvg: 0.0 - 5000.0; lambda = 0.150
foreign lambdas: 0.1
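The per-lambda "foreign lambdas" listings above are the raw energy differences that g_bar averages. Conceptually, for a single lambda pair the one-directional (FEP/Zwanzig) estimate is dF = -kT ln <exp(-dU/kT)>; g_bar itself uses the Bennett acceptance ratio over both directions, which is more robust. A toy sketch of the Zwanzig average with made-up dU values:

```python
import math

# ASSUMPTION: the tiny dU list is invented for illustration; g_bar
# computes BAR over both sampling directions, not this plain FEP.
kT = 2.494  # kJ/mol at ~300 K

def fep_delta_f(delta_u, kT=kT):
    """Zwanzig estimate: dF = -kT * ln( <exp(-dU/kT)> )."""
    boltz = [math.exp(-du / kT) for du in delta_u]
    return -kT * math.log(sum(boltz) / len(boltz))

print(f"dF = {fep_delta_f([1.0, 2.0, 3.0]):.3f} kJ/mol")
```

Because the exponential average is dominated by rare low-dU samples, a one-directional estimate can disagree between 4.5.7 and 4.6.2 runs for sampling reasons alone, which is one more motivation for comparing the BAR outputs rather than the raw averages.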

[gmx-users] free energy calculations of methane in water computed by GMX ver 4.5.7 and ver 4.6.2

2013-06-21 Thread Dwey
http://www.bevanlab.biochem.vt.edu/Pages/Personal/justin/gmx-tutorials/free_energy/Files/em_l-bfgs.mdp Again, I appreciate advice or a hint. Thanks, Dwey