Dear all:
I would like to use CHARMM36 and POPC for a membrane protein simulation, and I
am wondering where I can download a pre-equilibrated CHARMM36 POPC PDB and
topology file for GROMACS?
Thank you very much
Best
Hello:
I am wondering: is it possible to convert a NAMD topology .psf file into
GROMACS topology format?
Thank you very much
At 2011-05-30,"Francesco Oteri" wrote:
>On 29/05/2011 21:58, albert wrote:
>> Hello:
>> I am wondering, is it possible to convert a NAMD topology .psf file into
>> GROMACS format? I don't know if problems would arise for simulation.
>
>On 29/05/2011 22:10, albert wrote:
>> Thank you very much for the kind advice. Here is a warning, and I
>> don't know whether it indicates a problem or not:
>>
>> ; 'fake' gromacs topology generated from non-gromacs data. |WARNING
>> ; WARNING| it cannot be used for a simulation. |WARNING
difficult to modify them one by one.
Thank you very much
At 2011-05-30,"Francesco Oteri" wrote:
>You can solve the problem without converting from NAMD to GROMACS:
>you can use the PDB you've already found to obtain a valid GROMACS
>topology through pdb2gmx.
>
>Alternatively, you can replace the atom names using a text editor or a script.
>
>On 29/05/2011 22:35, albert wrote:
>> Well, I also tried to do this. But it seems that the atom names in my POPC
>> pdb file (which I downloaded from
>> http://terpconnect.umd.edu/~jbklauda/research/download.html) do not match.
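For the archive, a minimal sketch of the pdb2gmx route suggested above,
assuming the CHARMM36 port is unpacked as charmm36.ff in the working directory
(file names and water-model choice are illustrative, not from this thread):

pdb2gmx -f popc128.pdb -ff charmm36 -water tip3p -o conf.gro -p topol.top

Note that this only works when the atom names in the PDB match the .rtp
entries of the force field, which is exactly the naming mismatch discussed here.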
Thank you so much for your kind help. Did you pre-equilibrate it?
At 2011-05-30,"Jianguo Li" wrote:
Hi Albert,
Here is a .gro file of 128 POPC lipids which I constructed earlier using the
CHARMM36 FF. Please check that it is correct before using it.
Jianguo
From: albert
To: Discussion list for GROMACS users
Thank you so much for your kind help. I will try it.
At 2011-05-30,"Jianguo Li" wrote:
I equilibrated the system for about 20ns at 300K.
Jianguo
From: albert
To: Jianguo Li
Cc: Discussion list for GROMACS users
Sent: Monday, 30 May 2011 14:52:23
Subject: Re:Re: Re: [gmx-us
http://terpconnect.umd.edu/~jbklauda/research/download.html
From: gmx-users-boun...@gromacs.org on behalf of albert
Sent: Sun 29.05.2011 21:23
To: Discussion list for GROMACS users
Subject: Re:Re: [gmx-users] where can I download POPC membrane file?
But I don't think it is a pre-equilibrated POPC membrane.
>there is this difference in names. The POPC structure from the link
>below still has the atom names of the old CHARMM27 lipids (DPPC is
>fine). As suggested, a simple script can do the conversion for you.
>
>Cheers
>
>Tom
>
>albert wrote:
>> I know this, but this file
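A hedged illustration of the "simple script" Tom mentions: renaming the old
CHARMM27-style lipid atom names to the CHARMM36 .rtp names with sed. The name
pairs below are placeholders; the real mapping must be read off the POPC entry
in the CHARMM36 lipid .rtp file.

# OLD1/NEW1 etc. are hypothetical; substitute the actual CHARMM27 -> CHARMM36 pairs
sed -e 's/ OLD1 / NEW1 /g' -e 's/ OLD2 / NEW2 /g' popc_c27.pdb > popc_c36.pdb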
Hi:
I am using the following command to submit GROMACS MD jobs on a cluster:
mpirun -exe /opt/gromacs/4.5.5/bin/mdrun_mpi_bg -args "-nosum -dlb yes
-v -s nvt.tpr" -mode VN -np 128
Then I use the command "tail -f gromacs.out" to check the performance of my
jobs, and I get the following information:
Hello:
here is my log file for mdrun:
Writing final coordinates.
step 10, remaining runtime: 0 s
Average load imbalance: 10.8 %
Part of the total run time spent waiting due to load imbalance: 4.3 %
Steps where the load balancing was limited by -rdd, -rcon and/or -dds:
X 0 % Y 19 % Z
Hello:
I found that each time I try to increase the number of nodes for an MD run,
my job fails. It said:
Will use 192 particle-particle and 64 PME only nodes
This is a guess, check the performance at the end of the log file
---
Program md
Hi:
I am following the tutorial:
http://www.bevanlab.biochem.vt.edu/Pages/Personal/justin/gmx-tutorials/lysozyme/07_equil2.html
the NVT step goes well, but the NPT step always fails. It said:
Program mdrun_mpi_bg, VERSION 4.5.5
Source code file: ../../../src/mdlib/domdec.c, line: 2633
Fatal error:
thank you very much for kind reply.
I changed my command as follows:
mpirun -exe /opt/gromacs/4.5.5/bin/mdrun_mpi_bg -args "-nosum -dlb yes
-v -s npt.tpr -nt 1" -mode VN -np 256
the "-nt 1" option has been added above, but it still doesn't work. Here is
the log file:
Initializing Domain Decomposition
is there any solution to fix this?
THX
On 01/06/2012 09:52 AM, gmx-users-requ...@gromacs.org wrote:
"The minimum cell size is controlled by the size of the largest charge
group or bonded interaction and the largest of rvdw, rlist and rcoulomb,
some other effects of bond constraints, and a safety margin."
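A hedged practical reading of that quote (not from the thread itself): the
fatal error at domdec.c usually means the requested rank count would make the
domain-decomposition cells smaller than that minimum, so the fix is to run on
fewer ranks, or to bound the bonded-interaction distance by hand with mdrun's
-rdd option, for example (value illustrative):

mpirun -exe /opt/gromacs/4.5.5/bin/mdrun_mpi_bg -args "-s npt.tpr -rdd 1.0" -mode VN -np 128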
Hello:
I am submitting GROMACS jobs on a cluster, and the job ALWAYS terminates with
the following messages:
vol 0.75 imb F 5% pme/F 0.52 step 4200, will finish Sat Jan 7
09:36:14 2012
vol 0.77 imb F 6% pme/F 0.52 step 4300, will finish Sat Jan 7
09:36:28 2012
step 4389: Water molecule starting at atom … can not be settled.
…expect
that at around the 50 ns time scale an agonist-bound GPCR should also show
such side-chain switches, and that they may be lost at the 100 ns time scale?
I would appreciate it very much if someone could give me some comments on
this issue.
Thank you very much
best wishes
Albert
hello:
I am trying to run do_dssp with the command:
do_dssp -s md.tpr -f md.trr -b 400 -e 500 -o fws_ss.xpm
but it said:
Select a group: 1
Selected 1: 'Protein'
There are 35 residues in your selected group
trn version: GMX_trn_file (single precision)
Reading frame 400 time 400.000
Back Off! I just backed up …
Dear:
I am using GROMACS for a membrane simulation (with the CHARMM36 FF) which
contains around 80,000 atoms. I've submitted it to over 200 CPUs on the
cluster with a 2 fs time step. What really astonishes me is that the
performance of this simulation is only 3 ns/day. I am wondering what
Hello:
I am wondering: is it possible to optimize the hydrogen-bond network
before simulation? I've got some crystallographic waters in the system, and I
would like to optimize the H-bond network even before building the solvated
system.
THX
Hello:
there is a problem with my do_dssp; it always reports:
Program do_dssp_d, VERSION 4.5.5
Source code file: do_dssp.c, line: 572
Fatal error:
Failed to execute command: /usr/local/bin/dssp -na ddg1g7Id ddRwthIi >
/dev/null 2> /dev/null
Someone suggested compiling GROMACS in double precision.
And I replied "What's your dssp version? The most recent ones have different flags
that are not yet supported by gromacs."
Erik
On 30 Mar 2012, at 13.23, Albert wrote:
Hello:
there is some problem for my do_dssp, it always claimed:
Program do_dssp_d, VER
I've tried 2, 2.1.3, and 2.1.4; the problem is still there. I don't think
do_dssp is so difficult, it is just one command:
do_dssp -s md.tpr -f md.trr -b 100 -e 200 -o ss.xpm
On 03/30/2012 03:05 PM, Erik Marklund wrote:
And what versions were those?
On 30 Mar 2012, at 15.06, Albert wrote:
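A hedged note for the archive: do_dssp finds the executable through the DSSP
environment variable, and GROMACS 4.5.5 only understands the old-style DSSP
(dsspcmbi) command line, which is why the 2.x versions above fail on the "-na"
invocation. Assuming an old-style binary is installed (path hypothetical):

export DSSP=/usr/local/bin/dsspcmbi
do_dssp -s md.tpr -f md.trr -b 100 -e 200 -o ss.xpm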
Hello:
I am wondering: does anybody have experience with GROMACS for large-scale
simulation? I've heard a lot of people say that it would be difficult for
GROMACS. E.g., I've got a 60,000-atom system; is it possible for GROMACS to
produce 100 ns/day or even more, supposing I can use as
ce in Gromacs? Probably each of us can learn a lot from you.
thank you very much
best
Albert
Hello:
I am trying to run g_tune_pme on Blue Gene with the following script:
# @ job_name = bm
# @ class = kdm-large
# @ account_no = G07-13
# @ error = gromacs.info
# @ output = gromacs.out
# @ environment = COPY_ALL
# @ wall_clock_limit = 160:00:00
# @ notification = error
# @ job_type = bluegene
Hello:
I am trying to test g_tune_pme on a workstation with the command:
g_tune_pme_d -v -s md.tpr -o bm.trr -cpi md.cpt -cpo bm.cpt -g bm.log
-launch -nt 16 &
but it stopped immediately with the following logs. I compiled GROMACS with a
_d suffix on each module, such as mdrun_d, and I aliased mdrun_d to mdrun.
Dear:
I've generated a disoc.pdb file with CONCOORD; does anyone have
any idea how to analyze it with GROMACS g_cluster? When I read the manual
of g_cluster, it requires
-f traj.xtc Input, Opt. Trajectory: xtc trr trj gro g96 pdb cpt
-s topol.tpr Input, Opt. Structur
Hello:
I am using the following script to run GROMACS on a cluster, but it failed:
# @ job_name = bm
# @ class = kdm-large
# @ error = gromacs.info
# @ output = gromacs.out
# @ environment = COPY_ALL
# @ wall_clock_limit = 10:00:00
# @ notification = error
# @ job_type = bluegene
# @ bg_size = 64
Hello:
thank you very much for kind reply.
I ran NVT before the NPT MD production.
thank you very much
hello:
thank you very much for your kind messages. I first did minimization,
then NVT with gradual heating of the system from 0 to 310 K, and then the
NPT production:
--- NVT.mdp ---
define      = -DPOSRES -DPOSRES_LIG
constraints = hbonds
integrator  = md
dt          = 0.001 ; ps
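For reference, a hedged sketch of how such gradual 0 to 310 K heating is
typically written with the GROMACS simulated-annealing keywords (time points
illustrative; one annealing entry is needed per tc-grps group):

annealing         = single
annealing-npoints = 2
annealing-time    = 0 100      ; ps
annealing-temp    = 0 310      ; K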
hello:
I am wondering where we can obtain the latest Amber ff12SB FF for
GROMACS?
thx
hello:
I am running a 60,000-atom system with 128 cores on a Blue Gene
cluster, and it gets only 1 ns/day. Here is the script I used for
submitting the job:
# @ job_name = gmx_test
# @ class = kdm-large
# @ error = gmx_test.err
# @ output = gmx_test.out
# @ wall_clock_limit = 00:20:00
# @ job_t
es bg_size.
Probably I should compile it myself. Does anybody have any idea
how to compile GROMACS on Blue Gene?
thank you very much
Albert
On 04/25/2012 12:31 AM, Dr. Vitaly V. Chaban wrote:
hello:
I am running a 60,000 atom system with 128 core in a blue gene
cluster. and it is on
ffler wrote:
On Tue, 24 Apr 2012 15:42:15 +0200
Albert wrote:
hello:
I am running a 60,000 atom system with 128 core in a blue gene
cluster. and it is only 1ns/day here is the script I used for
You don't give any information about what exact system that is (L/P/Q?), or
whether you run singl
Hello:
I am running a membrane simulation with GROMACS and I am wondering how to
handle energygrps. Should I put protein and lipids into one energy group,
or should I leave the lipids with the solvent and ions?
thank you very much
best
Albert
groups for the above parameters
in NVT.mdp?
thank you very much
On 04/25/2012 04:13 PM, Justin A. Lemkul wrote:
On 4/25/12 10:07 AM, Albert wrote:
Hello:
I am running a membrane simulation with gromacs and I wondering how
to deal with
energygrps? Should I put protein and lipids into one
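A hedged .mdp sketch of the usual membrane grouping (group names are
illustrative and assume a matching index file; energygrps is only needed if
you want those interaction energies written to the .edr):

tc-grps    = Protein POPC SOL_ION    ; separate temperature-coupling groups
tau_t      = 0.5     0.5  0.5
ref_t      = 310     310  310
energygrps = Protein POPC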
Someone claimed that they can use optimized PME parameters to achieve even
100 ns/day for a 20,000-atom system. I also tried to use g_tune_pme to get
better performance, but the results are not so satisfying. Can you
give me some advice on this?
thank you very much.
best
Albert
On 04
Hello:
Does anybody have any idea how to run g_tune_pme on a cluster? I
tried many times with the following command:
g_tune_pme_d -v -s npt_01.tpr -o npt_01.trr -cpo npt_01.cpt -g
npt_01.log -launch -nt 24 > log &
but it always failed.
Option Type Value Description
…correct mdrun / mpirun executables?
Carsten
On Apr 26, 2012, at 9:28 AM, Albert wrote:
Hello:
Does anybody have any idea how to run g_tune_pme in a cluster? I tried many
times with following command:
g_tune_pme_d -v -s npt_01.tpr -o npt_01.trr -cpo npt_01.cpt -g npt_01.log -launch -nt 24 > log &
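Following Carsten's hint, a hedged sketch of how g_tune_pme is usually pointed
at the right executables via its MDRUN and MPIRUN environment variables (paths
hypothetical). Note that for an MPI run the rank count goes in -np, not -nt:

export MDRUN=/opt/gromacs/4.5.5/bin/mdrun_mpi
export MPIRUN=$(which mpirun)
g_tune_pme_d -np 24 -s npt_01.tpr -launch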
It seems to be good.
Just one piece of advice: why not use CHARMM36 for this tutorial,
since it is currently the best FF for lipids?
On 04/26/2012 11:14 AM, Anirban Ghosh wrote:
Hi ALL,
I have prepared a step-wise tutorial for running a MD simulation of a
GPCR protein inserted in a lipid
dification by ourselves.
best
Albert
On 04/26/2012 11:53 AM, Anirban Ghosh wrote:
Hello Albert,
Thanks.
Yes, CHARMM36 indeed handles lipids very well. But currently GROMACS
4.5.5 only provides the option of the CHARMM27 FF, and I found that ff43a1
very well preserves the character of both the
4.6-dev-20120423-25c75
GIT SHA1 hash:25c752a51955337dc61d80a180ed9efa26f2121f
Branched from:25c752a51955337dc61d80a180ed9efa26f2121f (0 newer
local commits)
Precision:single
Parallellization: thread_mpi
FFT Library: fftw3
On 04/26/2012 12:07 PM, Carsten Kutzner wrote:
On Apr 26, 2012, at 11:37
Hi Anirban:
how many ns/day do you get for your simulations? Did you use PME?
best
Albert
On 04/26/2012 12:59 PM, Anirban Ghosh wrote:
Hello Albert,
Good to know that!
I have carried out simulations using this FF in the range of 600 ns.
Regards,
Anirban
hello:
I am running NPT on a Blue Gene cluster, but the jobs always fail with the
following messages; however, everything goes well if I run it on my
local cluster:
---log---
vol 0.66! imb F 6% pme/F 0.45 step 900, will finish Mon Apr 30 04:46:31 2012
…and I didn't find any problem with it.
But I don't know why it doesn't work on the Blue Gene computer.
THX
ALbert
On 04/28/2012 07:36 AM, Mark Abraham wrote:
On 28/04/2012 2:04 PM, Albert wrote:
hello:
I am running NPT on a blue gene cluster, but the jobs always failed
with following me
hello:
I am wondering: are the three thermostat methods, Langevin, Berendsen and
Nosé-Hoover chains, all compatible with semi-isotropic pressure coupling?
If I would like to use semi-isotropic coupling, which one would be
better?
thank you very much
best
Albert
Hello Flo:
thank you so much for your kind comments.
Yes, I would like to couple the pressure; your comments really help a lot.
best
Albert
On 05/03/2012 10:40 AM, Dommert Florian wrote:
On Thu, 2012-05-03 at 07:32 +0200, Albert wrote:
hello:
I wondering are the three thermostat methods
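For the archive, a hedged .mdp sketch of semi-isotropic pressure coupling
(the thermostat is set independently via tcoupl; the values below are typical
membrane settings, not taken from this thread):

tcoupl          = nose-hoover
pcoupl          = Parrinello-Rahman
pcoupltype      = semiisotropic
tau_p           = 5.0
ref_p           = 1.0    1.0        ; xy, z
compressibility = 4.5e-5 4.5e-5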
hello:
I've finished an MD job and I am wondering how we can extract
individual PDB files from a trajectory in GROMACS. Each time I get a
single PDB containing lots of snapshots.
thank you very much
best
Albert
On 05/03/2012 05:12 PM, francesco oteri wrote:
In particular, look at the option -sep
Thank you for the kind reply, but how do I superimpose each snapshot on
the first one?
thanks again for helps
Thank you very much.
I found a problem: there seems to be no option to select the output
interval. E.g., I would like to export one snapshot every 10 ps, but I
can't find such an option.
THX
On 05/03/2012 05:21 PM, francesco oteri wrote:
-fit
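Putting the two hints together, a hedged trjconv line that least-squares fits
every frame onto the reference, keeps one frame per 10 ps, and writes each
frame to its own PDB file (file names assumed):

trjconv -s md.tpr -f md.trr -dt 10 -fit rot+trans -sep -o snap.pdb

It will prompt for the fit group and the output group.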
hello:
I am trying to use the amber2xtc.py script to convert an Amber MD system
into GROMACS format with the command:
python amber2xtc.py npt3.mdcrd apo.prmtop . *.rst md_gromacs
However, I got the following messages:
--log
USAGE : python amber2xtc.py AMBERCRD AMBERTOP T
hello:
I am running a GMX-4.6 beta2 GPU job on a 24-CPU-core workstation with
two GTX 590s. It is stuck there without any output, i.e. the .xtc file size
is still 0 after hours of running. Here is the md.log file I found:
Using CUDA 8x8x8 non-bonded kernels
Potential shift: LJ r^-12: 0.112 r^-6
- try running in GPU emulation mode with the GMX_EMULATE_GPU=1 env. var
set (and to match closer the GPU setup with -ntomp 12)
- provide a backtrace (using gdb).
Cheers,
--
Szilárd
On Mon, Dec 17, 2012 at 5:37 PM, Albert wrote:
hello:
I am running GMX-4.6 beta2 GPU wor
On 12/17/2012 06:08 PM, Szilárd Páll wrote:
Hi,
How about GPU emulation or CPU-only runs? Also, please try setting the
number of threads to 1 (-ntomp 1).
--
Szilárd
hello:
I am running in GPU emulation mode with the GMX_EMULATE_GPU=1 env. var
set (and to match closer the GPU setup with -nt
Well, that's one of the log files.
I've tried all of
VERSION 4.6-dev-20121004-5d6c49d
VERSION 4.6-beta1
VERSION 4.6-beta2
and the latest 5.0 by git.
The problems are the same. :-(
On 12/17/2012 07:56 PM, Mark Abraham wrote:
On Mon, Dec 17, 2012 at 6:01 PM, Albert wrote:
>
…minimization and found that there was a problem with my
ligand. I regenerated the ligand topology with acpype and resubmitted the
minimization and NVT. Now it goes well. So probably the problem came from
an incorrect ligand topology which made the system very unstable.
best
Albert
On 12/20/2012 09:13 AM, pcl wrote:
Well what works for me is I convert cgenff and merge it with charmm36 (you only
have to do this once per cgenff version), then I have paramchem generate cgenff
charges for the ligand. Then I convert the output of paramchem (charges) to
.rtp format. I also hav
ng on
this.
THX
Albert
…This script is trying to generate something like
what we see in a complete force-field folder instead of a single .itp
file for the ligand.
I am just wondering how we can convert the output from CGenFF into a
single .itp file similar to the one from SwissParam?
thank you very much
best
On 12/26/2012 12:18 PM, Peter C. Lai wrote:
You don't. CGenFF is a force field, like CHARMM36. You install it, add .rtp
entries, then use pdb2gmx to generate the ligand's topology .itp file.
THX
but the problem is how to use this script? I've already downloaded the
latest CGenFF file from CHARMM F
…y has the complete necessary information for parameters and topology).
Probably one could consider improving this script to export the output
as a single .itp file.
best
Albert
On 12/26/2012 07:53 PM, David van der Spoel wrote:
Hey, it's open source. Let us know how it goes
you can simply create an account and log in at
https://www.paramchem.org/
After you log in, click "upload molecule" in the left panel. Now you will
see the option:
"Include parameters that are already
…type
H-bonds with residues inside the protein. I am just wondering which
GROMACS module I can use to show the difference in solvent
flexibility. Is it possible to calculate the entropy in a certain region
(let's say 20
thank you very much
best
Albert
definitively assigned from the information in your input
files. These guessed numbers might deviate from the mass
and radius of the atom type. Please check the output
files if necessary.
Assertion failed for "g" in file
/home/albert/Desktop/gromacs-4.6-beta3/
file: /home/albert/software/gromacs/src/tools/gmx_msd.c,
line: 739
Fatal error:
The index group does not consist of whole molecules
For more information and tips for troubleshooting, please check the GROMACS
website at http://www.gromacs.org/Documentation/Errors
On 01/10/2013 11:14 AM, David van der Spoel wrote:
On 2013-01-10 10:45, Albert wrote:
Hello Justin and Leandro:
thanks a lot for the kind advice. I am trying to use g_msd to
calculate the density:
try g_msd -h
wrong tool.
that's strange. Here is the information which I think it is
How nice it is.
Cheers.
Albert
On 01/21/2013 09:09 AM, Mark Abraham wrote:
Hi GROMACS users,
The day is finally here - GROMACS 4.6 is out!
As you've probably heard by now, there are lots of wonderful new
performance features, including
* a native GPU implementation layer - thanks to
very much
best
Albert
probably you can try "catdcd"
On 01/21/2013 11:29 AM, Anna Marabotti wrote:
Dear gmx-users,
I followed the suggestions by Justin and Daniel to convert the
trajectories, but still Amber does not recognize the correct format
and complains about the fact that it does not find the correct box
di
/ for convenience.
-Justin
Hello Justin:
thank you very much for kind comments.
It works now.
best
Albert
hello:
I would like to compute statistics for an atom along the Z-axis. I am just
wondering how I can do this in GROMACS.
thank you very much
best
Albert
Hi Erik:
thanks a lot for the kind advice, I will try it.
best
Albert
On 01/24/2013 03:00 PM, Erik Marklund wrote:
g_traj -nox -noy if I recall correctly.
On Jan 21, 2013, at 4:10 PM, Albert wrote:
hello:
I would like to compute statistics for an atom along the Z-axis. I am just
wondering how can I
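For completeness, a hedged example of that suggestion; the index group
containing the single atom is assumed to exist in index.ndx:

g_traj -f md.xtc -s md.tpr -n index.ndx -ox atom_z.xvg -nox -noy

With -nox and -noy, the .xvg contains only the z-coordinate of the selected
atom over time, ready for statistics.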
Hello:
I am using make_ndx to make an index file in GROMACS 4.6:
make_ndx -f input.pdb
but it said:
Copied index group 1 'Protein'
Copied index group 25 'Water_and_ions'
One of your groups is not ascending
Group is empty
thank you very much
best
Albert
On 01/26/2013 06:53 PM, Justin Lemkul wrote:
What exactly did you enter at the make_ndx prompt?
-Justin
1|25
protein, water and ions
On 01/26/2013 07:41 PM, Justin Lemkul wrote:
What types of ions do you have? I can reproduce this problem for a
protein with ions bound to it, which are numbered discontinuously with
water and ions in solution.
-Justin
thank you for kind reply.
I only have Na+ and Cl-.
best
Albert
On 01/26/2013 07:51 PM, Justin Lemkul wrote:
Can you please post the following:
1. The groups printed in the make_ndx prompt
2. The output of gmxcheck on an index file created from your
coordinate file (created simply by typing 'q' at the prompt, i.e. not
creating any special groups)
-Justin
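A hedged sketch of producing the two items Justin asks for (file names
assumed; piping "q" accepts the default groups without creating new ones):

echo q | make_ndx -f input.pdb -o index.ndx
gmxcheck -n index.ndx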
…FF for proteins? Currently, there is only the
CHARMM36 FF for lipids. It seems that the CHARMM36 FF for proteins
introduced the NBFIX term, which is absent in any previous version of
CHARMM; probably it will take some time for this to be introduced into GROMACS.
regards
Albert
-stepout 100
Please cite:
Wolf et al, J Comp Chem 31 (2010) 2169-2174.
does it mean that g_membed is deprecated in GROMACS 4.6 and we must use
mdrun instead?
THX
Albert
with errors.
not enough space for the XTC?
However, when I ran it in GROMACS 4.5.6 via g_membed, it finished fine.
best
Albert
.
So I am just wondering what's happening?
thank you very much
best
Albert
On 02/07/2013 11:03 AM, James Starlight wrote:
Hi Albert!
If I understood you correctly, you could run simulations with your 2
GPU cards on the GROMACS beta but could not do it with the final version,
is that right?
Not really; both versions could run with the GPUs. The 4.6 beta recognized my
number of GPUs as 4
On 02/07/2013 11:28 AM, James Starlight wrote:
Also, could you tell me what performance your system achieves (in GFLOPS)
and what system you have simulated on it (average atom number,
presence of explicit membrane, etc.)?
It is something around 55,000 atoms with an explicit membrane. I am using
Slipids F
here is the log for mdrun:
Program mdrun_mpi, VERSION 4.6
Source code file:
/home/albert/Documents/2013-02-06/gromacs-4.6/src/gmxlib/gmx_detect_hardware.c,
line: 356
Fatal error:
Incorrect launch configuration: mismat
Hi:
Thanks for the kind comments.
It works fine now after I recompiled GROMACS carefully.
best
Albert
On 02/08/2013 03:43 AM, Szilárd Páll wrote:
Hi,
If you have two GTX 590-s four devices should show up in nvidia-smi and
mdrun should also show four devices detected. As nvidia-smi shows only
…a performance of 32 ns/day. That's really nice.
I don't know whether other users have similar experiences with this new
version.
Albert
1.680 3.925 8.606 0.5957 0.0504 -0.4752
1TYRHE1 14 1.728 3.974 8.689 1.1798 -3.6443 1.5406
THX
Albert
…file every 2 ps;
probably that's the reason why it doesn't have velocity information.
My MD is still running; I am just wondering, is there any way to extract
the last snapshot into a .gro file with velocity information?
thanks
best
Albert
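A hedged possibility: the checkpoint file stores the complete state,
including velocities, and trjconv accepts .cpt input, so the last frame can
be written out directly (file names assumed; velocities are kept by default
via -vel):

trjconv -f md.cpt -s md.tpr -o last.gro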
Do you have any idea
for this?
thanks again for the kind help.
best
Albert
.
-Justin
I see, that's really helpful.
thanks a lot
Albert
…ering whether I can specify some parameters in the .mdp file
so that GROMACS exports a frame with velocity information every
20 ns?
thank you very much
best
Albert
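A hedged sketch: .gro output itself cannot be scheduled in the .mdp, but
full-precision frames with velocities can be written to the .trr at any
interval and converted afterwards. With dt = 0.002 ps, 20 ns is 10,000,000
steps:

nstxout = 10000000    ; coordinates to .trr every 20 ns
nstvout = 10000000    ; velocities to .trr every 20 ns

Afterwards, something like "trjconv -f md.trr -s md.tpr -sep -o frame.gro"
splits those frames into individual .gro files.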
The easiest solution is to kill MacOS and switch to Linux.
;-)
Albert
On 03/01/2013 06:03 PM, Szilárd Páll wrote:
Hi George,
As I said before, that just means that most probably the GPU driver is not
compatible with the CUDA runtime (libcudart) that you installed with the
CUDA
Hello:
I am wondering: were the force fields updated in this new version? E.g.,
was CHARMM36_protein added, or CHARMM36_lipids updated?
thank you very much
Albert
On 03/05/2013 08:14 PM, Mark Abraham wrote:
Hi GROMACS users,
GROMACS 4.6.1 is officially released. It contains numerous