Hi Mark,

Thanks for your reply.
If I open the .tpr file in Notepad, it appears to be a binary file.
How, then, do I remove the bonded terms and zero the VDW parameters?
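Am I right that those edits have to go into the plain-text topology files
that grompp reads (the .top and .itp files), with the .tpr then rebuilt by
grompp? Here is a sketch of what I have in mind (only my guess; the
atom-type line below is an illustrative placeholder, not real parameters):

; in a local copy of the force field's ffnonbonded.itp, zero the LJ
; parameters, i.e. set the last two columns (sigma, epsilon) to 0.0
[ atomtypes ]
; name  at.num  mass      charge   ptype  sigma        epsilon
  N     7       14.01000  0.00000  A      0.00000e+00  0.00000e+00

; and in the topol.top written by pdb2gmx, delete or comment out (with ';')
; the [ bonds ], [ angles ] and [ dihedrals ] sections of the protein
; [ moleculetype ], then regenerate the .tpr with grompp

Is that the right idea, or is there a simpler way?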

I really need to compare how fast different well-known packages can
compute the GB polarization energy, and how good the resulting energy
values are. That's why timing is an important factor in my experiments,
and I really do want to measure the time for the GB energy in isolation!
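To be concrete, the measurement I am planning looks roughly like this,
following your rerun suggestion and using the stripped topology sketched
above (file names are placeholders):

grompp -f mdr.mdp -c conf.gro -p topol_gb_only.top -o gb_only.tpr
mpirun -np 8 mdrun_mpi -s gb_only.tpr -rerun traj.trr -g gb_only.log
# then read the "Generalized Born Coulomb" and "Born radii (HCT/OBC)" rows
# of the flop accounting, and the "Force" row of the cycle and time
# accounting, at the end of gb_only.log

I take your point that a single evaluation is far too short to time
reliably, so I would rerun over a trajectory with enough frames that the
run lasts at least several minutes.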

Thanks,
Jesmin
>
> On Thu, Aug 16, 2012 at 2:44 AM, Mark Abraham <mark.abra...@anu.edu.au> wrote:
>>
>> On 16/08/2012 4:26 PM, jesmin jahan wrote:
>>>
>>> Hi Mark,
>>>
>>> Thanks for your previous reply.
>>> I tried to run single point energy simulation with some proteins.
>>> I got .log files with content like this:
>>>
>>> Energies (kJ/mol)
>>>             Bond          Angle    Proper Dih.  Improper Dih.GB Polarization
>>>      1.54109e+04    3.84351e+03    8.47152e+03    3.58425e+02   -1.69666e+04
>>>            LJ-14     Coulomb-14        LJ (SR)   Coulomb (SR)      Potential
>>>      4.29664e+03    3.63997e+04    2.22900e+05   -5.18818e+04    2.22832e+05
>>>      Kinetic En.   Total Energy    Temperature Pressure (bar)
>>>      1.08443e+09    1.08465e+09    2.73602e+07    0.00000e+00
>>> .......
>>>
>>> Computing:                               M-Number         M-Flops  % Flops
>>> -----------------------------------------------------------------------------
>>>   Generalized Born Coulomb                 0.005711           0.274     0.2
>>>   GB Coulomb + LJ                          0.416308          25.395    18.5
>>>   Outer nonbonded loop                     0.016367           0.164     0.1
>>>   1,4 nonbonded interactions               0.008410           0.757     0.6
>>>   Born radii (HCT/OBC)                     0.439486          80.426    58.5
>>>   Born force chain rule                    0.439486           6.592     4.8
>>>   NS-Pairs                                 0.943653          19.817    14.4
>>>   Reset In Box                             0.003179           0.010     0.0
>>>   CG-CoM                                   0.006358           0.019     0.0
>>>   Bonds                                    0.003219           0.190     0.1
>>>   Angles                                   0.005838           0.981     0.7
>>>   Propers                                  0.011273           2.582     1.9
>>>   Virial                                   0.003899           0.070     0.1
>>>   Stop-CM                                  0.003179           0.032     0.0
>>>   Calc-Ekin                                0.006358           0.172     0.1
>>> -----------------------------------------------------------------------------
>>>   Total                                                     137.479   100.0
>>> -----------------------------------------------------------------------------
>>>
>>>
>>>      D O M A I N   D E C O M P O S I T I O N   S T A T I S T I C S
>>>
>>>   av. #atoms communicated per step for force:  2 x 6859.0
>>>
>>>
>>>       R E A L   C Y C L E   A N D   T I M E   A C C O U N T I N G
>>>
>>>   Computing:         Nodes     Number     G-Cycles    Seconds     %
>>> -----------------------------------------------------------------------
>>>   Domain decomp.        16          1        0.043        0.0     1.4
>>>   Comm. coord.          16          1        0.003        0.0     0.1
>>>   Neighbor search       16          1        0.103        0.0     3.5
>>>   Force                 16          1        1.530        0.5    51.5
>>>   Wait + Comm. F        16          1        0.264        0.1     8.9
>>>   Write traj.           16          1        0.062        0.0     2.1
>>>   Update                16          1        0.001        0.0     0.0
>>>   Comm. energies        16          2        0.933        0.3    31.4
>>>   Rest                  16                   0.031        0.0     1.1
>>> -----------------------------------------------------------------------
>>>   Total                 16                   2.970        0.9   100.0
>>> -----------------------------------------------------------------------
>>>
>>> NOTE: 31 % of the run time was spent communicating energies,
>>>        you might want to use the -gcom option of mdrun
>>>
>>>
>>>         Parallel run - timing based on wallclock.
>>>
>>>                 NODE (s)   Real (s)      (%)
>>>         Time:      0.056      0.056    100.0
>>>                 (Mnbf/s)   (GFlops)   (ns/day)  (hour/ns)
>>> Performance:      7.497      2.442      1.535     15.637
>>>
>>>
>>> From the log file, it seems the time includes the time for the LJ and
>>> Coulomb potential energies. But as I said before, I am only interested
>>> in the GB energy times. I am doing a comparative study of GB energy
>>> performance (values vs. time) for different molecular dynamics packages.
>>
>>
>> Since the LJ calculation also needs the distances, GROMACS does them in the 
>> same loops and makes no apology for being efficient. :-) If you're really 
>> trying to measure the time for the GB energy in isolation, then you will 
>> need to construct a different model physics that lacks LJ interactions. Or 
>> perhaps you don't really want to measure the time for GB energy in 
>> isolation. Depends what you're planning on using the information for, but 
>> usually measuring a time representative of the calculation you plan to run 
>> later is a good way to avoid having to account for lots of subtleties of 
>> different packages.
>>
>>
>>> That's why I was trying to subtract the time for any other extra energy
>>> computation from it.
>>>
>>> Can anyone tell me how to get the exact time for the GB polarization
>>> energy (including the Born radii), excluding the time for any other
>>> energy terms (like LJ and Coulomb), from a GROMACS simulation?
>>
>>
>> The .tpr you use for the rerun doesn't have to be one that will produce a 
>> sensible model physics. If you remove the bonded terms and zero the VDW 
>> parameters then the only thing left to compute is the electrostatics, which 
>> will give you the time you seek. You'll still potentially have time spent 
>> doing neighbour searching, and that is something you need to consider for 
>> gauging relative performance of different packages. Again, the times you 
>> measure will not be significant unless you run for at least several minutes.
>>
>> Mark
>>
>>
>>>
>>>
>>> Thanks,
>>> Jesmin
>>>
>>>
>>>
>>> On Tue, Aug 14, 2012 at 10:16 AM, jesmin jahan <shraba...@gmail.com> wrote:
>>>>
>>>> Thanks, Mark, for your reply. I was trying to use the single-point energy
>>>> calculation as you advised in your first reply, but for most of the
>>>> files the simulation failed because I was using the original .pdb
>>>> files in the mdrun command.
>>>>
>>>> Anyways. I really appreciate your help.
>>>> Thanks again,
>>>> Jesmin
>>>>
>>>> On Tue, Aug 14, 2012 at 1:26 AM, Mark Abraham <mark.abra...@anu.edu.au> 
>>>> wrote:
>>>>>
>>>>> On 14/08/2012 7:38 AM, jesmin jahan wrote:
>>>>>>
>>>>>> Dear Gromacs Users,
>>>>>>
>>>>>> I have some questions regarding GB-Polarization Energy Calculation
>>>>>> with Gromacs. I will be grateful if someone can help me with the
>>>>>> answers.
>>>>>>
>>>>>> I am trying to calculate the GB polarization energy for different
>>>>>> protein molecules. I am interested in both the energy values and the
>>>>>> time required to calculate the Born radii and the polarization energy.
>>>>>> I am not doing any energy minimization step, as the files I am using as
>>>>>> input are already minimized.
>>>>>>
>>>>>> Here is the content of my  mdrun.mdp file:
>>>>>>
>>>>>> constraints         =  none
>>>>>> integrator            =  md
>>>>>> pbc                       =  no
>>>>>> dt                         =  0.001
>>>>>> nsteps                 =  0
>>>>>> implicit_solvent    =  GBSA
>>>>>> gb_algorithm        =  HCT
>>>>>> sa_algorithm        =  None
>>>>>>
>>>>>> And I am using following three steps for all the .pdb files I have:
>>>>>>
>>>>>> let x is the name of the .pdb file.
>>>>>>
>>>>>> pdb2gmx -f x.pdb -ter -ignh -ff amber99sb -water none
>>>>>> grompp -f mdr.mdp -c conf.gro -p topol.top -o imd.tpr
>>>>>> mpirun -np 8 mdrun_mpi  -deffnm imd -v -g x.log
>>>>>
>>>>>
>>>>> So you're not using the advice I gave you about how to calculate single
>>>>> point energies. OK.
>>>>>
>>>>>
>>>>>> 1. Now, the running time reported by the log file also includes other
>>>>>> times. It's also not clear to me whether the time includes the time for
>>>>>> the Born radii calculations.
>>>>>
>>>>>
>>>>> The timing breakdown is printed at the end of the .log file. Likely your
>>>>> time is heavily dominated by the GB calculation and communication cost.
>>>>> Born radii calculations are part of the former, and not reported
>>>>> separately. You should not bother with timing measurements unless your
>>>>> run goes for at least several minutes, else your time will be dominated
>>>>> by I/O and setup costs.
>>>>>
>>>>>
>>>>>> So, to get the GB energy time, I am doing the following: I am also
>>>>>> running a simulation with "implicit_solvent" set to "no" and I am
>>>>>> taking the difference of the two (with GB and without GB). Is that the
>>>>>> right approach?
>>>>>
>>>>>
>>>>> No, that measures the weight difference between an apple and an orange,
>>>>> not whether the apple's seeds are heavy.
>>>>>
>>>>>
>>>>>> I also want to be sure that it also includes Born-Radii calculation time.
>>>>>
>>>>>
>>>>> It's part of the GB calculation, so it's included in its timing.
>>>>>
>>>>>
>>>>>> Is there any other approach to do this?
>>>>>>
>>>>>>
>>>>>> 2. I was trying to run the simulations on 192 cores (16 nodes, each
>>>>>> with 12 cores). But I got a "There is no domain decomposition for 12
>>>>>> nodes that is compatible with the given box and a minimum cell size of
>>>>>> 2.90226 nm" error for some pdb files. Can anyone explain what is
>>>>>> happening? Is there any restriction on the number of nodes that can be
>>>>>> used?
>>>>>
>>>>>
>>>>> Yes. See discussion linked from 
>>>>> http://www.gromacs.org/Documentation/Errors
>>>>>
>>>>>
>>>>>> 3. I ran the simulations with 1-way 96 (8 nodes, each with 12 cores).
>>>>>> It's not clear to me from the log file whether GROMACS is able to
>>>>>> utilize all 96 cores; it seems it is using only 8 nodes.
>>>>>> Does GROMACS use both shared- and distributed-memory parallelism?
>>>>>
>>>>>
>>>>> Not at the moment. Look at the top of your .log file for clues about what
>>>>> your configuration is making available to GROMACS. It is likely that
>>>>> mpirun -np 8 makes only 8 MPI processes available to GROMACS. Using more
>>>>> will require you to use your MPI installation correctly (and we can't
>>>>> help with that).
>>>>>
>>>>>
>>>>>> 4. In the single-point energy calculation "mdrun -s input.tpr
>>>>>> -rerun configuration.pdb", is the configuration.pdb mentioned the
>>>>>> original pdb file used with pdb2gmx's -f option, or is it a modified
>>>>>> pdb file? I am asking because using the original file does not always
>>>>>> work :-(
>>>>>
>>>>>
>>>>> It can be any configuration that matches the .top file you gave to
>>>>> grompp. That's the point - you only need one run input file to compute
>>>>> the energy of any such configuration you later want. The configuration
>>>>> you gave to grompp (or any other tool) doesn't matter.
>>>>>
>>>>>
>>>>>> 5. Is there any known speedup factor for GROMACS on multicore systems?
>>>>>
>>>>>
>>>>> That depends on your simulation system, hardware, network and algorithm.
>>>>> Don't bother with fewer than hundreds of atoms per core.
>>>>>
>>>>> Mark
>>>>
>>>>
>>>>
>>>> --
>>>> Jesmin Jahan Tithi
>>>> PhD Student, CS
>>>> Stony Brook University, NY-11790.
>>>
>>>
>>>
>>
>
>
>
>
> --
> Jesmin Jahan Tithi
> PhD Student, CS
> Stony Brook University, NY-11790.
>



--
Jesmin Jahan Tithi
PhD Student, CS
Stony Brook University, NY-11790.
