On 2012-08-25 13:07, Luca Mollica wrote:
I have read a variation on this advice many, many times here:
"The ultimate choice of force field should be based on your reading and
understanding of their derivation and known applications/limitations,
all of
which comes from the literature. Choose
Beyond optimizing your number of cores and npme (by hand, I guess, in your case),
you can try hyperthreading (talk to your sysadmins if you cannot simply run 2N
threads, where N is the number of cores). I found on a POWER6 that if you have
N cores, it is optimal to run 2N-1 threads, rather than 2N.
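As a rough illustration (the node size and the npme split below are made-up numbers, not recommendations), such a run with the threaded mdrun of GROMACS 4.5 could look like:

mdrun -nt 63 -npme 15 -deffnm md    # 2N-1 = 63 threads on a hypothetical 32-core SMT node

Here -nt sets the total number of threads and -npme reserves some of them for PME; whether 2N-1 actually helps depends on the hardware, so timing a short run for each setting is the safest check.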
Thanks for your reply.
Sincerely,
Shima
- Original Message -
From: Justin Lemkul
To: Discussion list for GROMACS users
Cc:
Sent: Saturday, August 25, 2012 9:16 PM
Subject: Re: [gmx-users] grompp in KALP15-DPPC
On 8/25/12 12:41 PM, Shima Arasteh wrote:
The number of DPPC molecules was 126. I changed it to 128 in topol.top, and then
grompp didn't give me any fatal error.
I had changed the number of DPPC in the top file in the last step, when InflateGRO
reported that 2 DPPC were removed. Now 128 DPPC is acceptable! What does it
mean? Is it sensible?
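For context, the [ molecules ] section of topol.top must list the molecules in exactly the numbers and order in which they appear in the coordinate file that grompp reads. A sketch of how the end of topol.top might look for this system (the molecule names are assumed from the KALP15-DPPC tutorial; the solvent count is whatever genbox reports):

[ molecules ]
; Compound        #mols
Protein_chain_A       1
DPPC                128
SOL                 ...   ; number reported by genbox

grompp checks that the total atom count in the topology matches the coordinate file, so if 128 passes, the .gro that grompp read presumably still contains the atoms of 128 lipids; it is worth double-checking which .gro actually went into grompp.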
Hi all,
Can anyone suggest how to build an elastic network model
in GROMACS (any literature or tutorial)?
Thanks in advance,
K.Mohan
Hi,
I am doing the simulation of KALP15 in DPPC following Justin's tutorial.
For the solvation step I ran this command:
# genbox -cp system.gro -cs spc216.gro -o system_solv.gro -p topol.top
The output of this step is:
Reading solute configuration
frame t= 1.000
Containing 17503 at
On 8/25/12 11:50 AM, jesmin jahan wrote:
Hi Justin,
Thanks for your reply.
Do you mean that GROMACS does not support the Tesla M2090?
I have used the force-device=yes option. But the problem is, it runs
but does not give me the GB polarization energy; it only gives the
potential energy.
My intention was to calculate the GB energy. In .mdp
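For what it's worth, a minimal sketch of the implicit-solvent settings in the .mdp (the values here are only illustrative; check the 4.5 manual for what your system needs):

implicit_solvent   = GBSA
gb_algorithm       = OBC      ; or Still / HCT
nstgbradii         = 1
rgbradii           = 1.0      ; nm, commonly set equal to rlist
sa_algorithm       = Ace-approximation

With the plain CPU mdrun, g_energy on the resulting .edr file should then offer a separate "GB Polarization" term; whether the OpenMM-based mdrun-gpu writes that energy decomposition at all is a separate question and may be the limitation you are seeing.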
On 8/25/12 10:57 AM, jesmin jahan wrote:
Dear all,
While running mdrun-gpu, I got the following error:
The selected GPU (#0, Tesla M2090) is not supported by Gromacs!
However, the GROMACS site says that the Tesla M2090 is supported.
Then I used mdrun-gpu -device
"OpenMM:platform=Cuda,memtest=15,devic
Hi,
Thanks for your reply. I actually want to build an elastic
network model (ENM) for a protein containing 372 residues. According to
the literature, an ENM considers only the C-alpha atoms of the protein and
discards the rest of the atoms. The potential contains only bonded terms.
It does not
contain
Dear Users:
I am using user-specified potential functions with GROMACS 4.5.4. According
to the manual, I wrote a FORTRAN program to generate the table.xvg file. When I
ran mdrun, the following error came out:
..
Only 7 columns on line 4199 in file table.xvg
Only 7 columns on line 4200 in fil
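For a plain non-bonded table, mdrun expects the same number of whitespace-separated fields on every data line, seven in the standard case: r, f(r), -f'(r), g(r), -g'(r), h(r), -h'(r), where f, g and h default to the 1/r, -1/r^6 and 1/r^12 forms. A sketch of one correctly formatted line (values for r = 1.0 with those standard forms):

1.000000e+00  1.000000e+00  1.000000e+00 -1.000000e+00 -6.000000e+00  1.000000e+00  1.200000e+01

An error that appears only at particular lines (4199, 4200, ...) usually means the field count changes there, for example because the FORTRAN output fused two numbers or printed asterisks when a value overflowed its field width, or because a record was wrapped; a wide E/ES edit descriptor with an explicit blank between fields usually avoids this.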
I have read a variation on this advice many, many times here:
"The ultimate choice of force field should be based on your reading and
understanding of their derivation and known applications/limitations, all of
which comes from the literature. Choose the one that you think is most
sound :)"
Hi,
In case you want the "bonds" to be dynamic, I think the only way is to use
tabulated interactions. A bit tedious to set up, but doable. Note that this
will NOT emulate angles or dihedrals, just the stretching term. I can't
understand why you would want bonds between all atoms within a certai
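If it helps, a sketch of what a tabulated bonded entry looks like in the topology (the atom indices, table number and scaling constant are invented here; function type 8 is the tabulated bond):

[ bonds ]
;  ai   aj  funct  table    k
    5   27      8      0   1.0    ; uses table_b0.xvg, k scales the tabulated potential

The corresponding table_b0.xvg file is handed to mdrun with -tableb, and as noted above this only replaces the stretching term, not angles or dihedrals.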
Dear gmx users,
I am using GROMACS 4.5.5.
I was trying to use g_tune_pme for a simulation. I intend to run
mdrun on multiple nodes with 12 cores each, so I would like to
optimize the number of PME nodes. I could execute g_tune_pme -np 12
md.tpr, but this will only find the optimal
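As a sketch (the binary names are assumptions for a typical MPI build), g_tune_pme picks up the MPI launcher and the mdrun binary from environment variables, and -np is the total number of ranks over all nodes, so for 4 nodes with 12 cores each it would be something like:

export MPIRUN=$(which mpirun)
export MDRUN=$(which mdrun_mpi)
g_tune_pme -np 48 -s md.tpr -r 2 -launch

-r repeats each test run and -launch starts the production run with the best settings found.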
Hi all,
I want to build a topology for a protein in which each and
every atom has to make bonds with the other atoms within a certain
specified cut-off distance. How can I do this?
Please suggest a way.
Thanks in advance,
Mohan
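One common way to express an elastic network in a GROMACS topology (a sketch only; the atom numbers, distances and force constant are invented, and the pair list itself has to be produced by an external script, since as far as I know there is no built-in ENM builder) is to add function type 6 "bonds", i.e. harmonic potentials that do not generate exclusions, between every C-alpha pair within the chosen cut-off:

[ bonds ]
;  ai    aj  funct   b0 (nm)   kb (kJ mol^-1 nm^-2)
    1     9      6     0.382     500.0   ; CA-CA pair within the cut-off
    1    17      6     0.598     500.0
; ... one line per C-alpha pair within the cut-off ...

b0 is the C-alpha distance in the reference structure and kb the elastic constant; with type 6 the entries restrain the network without being converted to constraints or affecting the exclusion list.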
Dear users:
Our institute got an IBM Power 775 cluster, which is claimed to be very
good. However, it doesn't support g_tune_pme. I use the following script
for job submission:
#@ job_name = gromacs_job
#@ output = gromacs.out
#@ error = gromacs.err
#@ class = kdm
#@ node = 4
#@ tasks_per_node = 3
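If g_tune_pme cannot drive the launcher on that machine, one manual fallback (a sketch only; "poe" is assumed to be the site's MPI launcher and short.tpr a short benchmark input) is to submit a few short runs with different -npme values and compare the ns/day reported at the end of each log:

for npme in 0 2 3 4; do
    poe mdrun_mpi -s short.tpr -npme $npme -deffnm npme_$npme
done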