I am running MD simulations on liquid/liquid interfaces and measuring the
interfacial tension between them. I have found that the readings in NVT
simulations are close to experimental values, but have a lot of variation.
I run NPT simulations on the exact same system and find the results show
very
, it will make the system closer to experimental values. I will give this
a try and see what happens. My question still remains: why do NPT and NVT
simulations give such different values for surface tension?
Denny Frost
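A note on where such a number comes from: for a slab with two interfaces
normal to z, gamma = (Lz/2) * (Pzz - (Pxx + Pyy)/2), and g_energy can
extract this directly because the edr file stores a '#Surf*SurfTen' term
(number of surfaces times the tension). A minimal sketch, with placeholder
file names:

    # select the surface-tension term non-interactively
    echo "#Surf*SurfTen" | g_energy -f md.edr -s md.tpr -o surften.xvg
    # divide the reported average by the number of interfaces (two here)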
On Sat, Mar 12, 2011 at 6:40 AM, aldi asmadi wrote:
> David,
>
>
Is that using anisotropic pressure coupling?
On Sat, Mar 12, 2011 at 8:55 AM, David van der Spoel wrote:
> On 2011-03-12 16.45, Denny Frost wrote:
>
>> I have run NPT simulations using isotropic and semi-isotropic coupling
>> with the same results. I have never done coupling in the z direction,
>> but gromacs 4.5.3 won't let you specify tau_p = 0. Any other way to do
>> pressure coupling in just the z direction?
Denny
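One route that is sometimes suggested (a sketch, untested on this system):
stay with semiisotropic coupling but zero the lateral compressibility, so
only the box height responds to the barostat; if grompp refuses an exact
0, a tiny value such as the 4.5e-15 used later in this thread has the same
practical effect.

    ; z-only pressure coupling, assuming a Berendsen barostat;
    ; first ref_p/compressibility value is xy, second is z
    pcoupl          = berendsen
    pcoupltype      = semiisotropic
    tau_p           = 2.0
    ref_p           = 1.0 1.0
    compressibility = 0 4.5e-5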
On Sat, Mar 12, 2011 at 8:59 AM, Denny Frost wrote:
> Is that using anisotropic pressure coupling?
>
>
> On Sat, Mar 12, 2011 at
On Sat, Mar 12, 2011 at 9:24 AM, David van der Spoel wrote:
> On 2011-03-12 17.17, Denny Frost wrote:
>
>> Thanks for answering that question about dispersion, that makes sense.
>> Also, the values I currently get with NPT are around 58 mN/m, while the
>> average values I get fo
On 2011-03-12 17.28, Denny Frost wrote:
>
>> No, it requires six, actually, for anisotropic coupling. I decided to
>> use semi-isotropic coupling with the xy compressibilities set to 4.5e-15
>> (it won't accept 0). This should keep the walls parallel to the z axis
Dear all,
I am trying to equilibrate a solvent of pure ionic liquid. The system
keeps exploding (after 2-5 ns) and I am not sure why, though I believe
coulombic interactions are to blame. This is because the Coul-SR term is
negative, but the Coul. recip term is very positive throughout the entire
simulation.
compressibility = 4.5e-5
; Generate velocities is on at 300 K.
gen_vel = yes
gen_temp= 300.0
gen_seed= -1
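If the reciprocal-space term really is at fault, tightening the PME
settings is a cheap first experiment. A sketch (illustrative values, not a
recommendation tuned to this system):

    ; tighter PME for a charge-dense system, assuming coulombtype = PME
    pme_order      = 6      ; interpolation order, default 4
    fourierspacing = 0.10   ; grid spacing in nm, default 0.12
    ewald_rtol     = 1e-6   ; direct/reciprocal split tolerance, default 1e-5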
On Fri, Feb 24, 2012 at 12:17 AM, Dommert Florian <domm...@icp.uni-stuttgart.de> wrote:
> On Thu, 2012-02-23 at 13:35 -0700, Denny Frost wrote:
> > Dear all
these have not
resolved this issue. How can I calculate the error in the electrostatic
force?
Denny
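For reference, recent 4.x releases ship a pme_error tool that estimates
exactly this quantity from the run input; a sketch, assuming the tool is
present in your build:

    # prints direct- and reciprocal-space force error estimates
    # for the PME settings stored in the tpr
    g_pme_error -s md.tpr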
On Sat, Feb 25, 2012 at 4:43 AM, Dommert Florian <domm...@icp.uni-stuttgart.de> wrote:
> On Fri, 2012-02-24 at 11:03 -0700, Denny Frost wrote:
> > Thank you both for your replies. I
Reciprocal space error estimate of 3.3 kJ/(mol nm).
On Wed, Feb 29, 2012 at 3:47 AM, Dommert Florian <domm...@icp.uni-stuttgart.de> wrote:
> On Mon, 2012-02-27 at 11:05 -0700, Denny Frost wrote:
> > The ionic liquid is bistriflate N-methyl-N-propyl pyrrolidinium and
> > the force field
Can someone explain to me what "compute a radial membrane normal" means in
the g_order options. I would like to calculate S with respect to the
surface of a spherical particle instead of one of the axes. Is this what
it does?
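That is my reading of the option as well (an assumption; check g_order -h):
with -radial the local normal is taken along the vector to a reference
group instead of a fixed box axis. A sketch with placeholder names, where
index.ndx would need a group for the chains and one for the particle
centre:

    g_order -f traj.xtc -s md.tpr -n index.ndx -radial -od order.xvg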
Gromacs Users,
I am interested in implementing a thole polarization scheme in my
simulations of ionic liquids. However, the gromacs 4.5 manual does
not give much information on this feature beyond a brief mentioning.
An example of the implementation can be found in the mailing list at
http://list
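The topology directives themselves are in place even where the manual is
brief. Purely a schematic sketch of the input shape (atom indices and
parameters are invented placeholders; the column layout should be checked
against the topology tables in the manual):

    [ polarization ]
    ; ai  aj  funct  alpha (nm^3)
       1   2      1  0.001

    [ thole_polarization ]
    ; ai  aj  ak  al  funct  a    alpha1  alpha2
       1   2   3   4      1  2.6  0.001   0.001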
Dear All,
I have been trying to install gromacs 4.5.3 on my iMac with no success. I
followed the "quick and dirty instructions" and downloaded openmpi-1.4.2,
fftw-3.2.2, and gromacs-4.5.3. Unpacked them and ran ./configure as
explained in the instructions. The openmpi installation says it was
successful
On 28/10/2011 4:57 AM, Denny Frost wrote:
>
> Dear All,
> I have been trying to install gromacs 4.5.3 on my iMac with no success. I
> followed the "quick and dirty instructions" and downloaded openmpi-1.4.2,
> fftw-3.2.2, and gromacs-4.5.3. Unpacked them and ran ./configure
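For the record, the 4.5.x series builds with autotools; a minimal sketch
of the intended sequence (prefixes and paths are placeholders, and
--enable-mpi assumes the OpenMPI wrappers ended up on your PATH):

    # fftw first
    cd fftw-3.2.2
    ./configure --prefix=$HOME/fftw --enable-float
    make && make install
    # then gromacs, pointed at that fftw
    cd ../gromacs-4.5.3
    export CPPFLAGS=-I$HOME/fftw/include LDFLAGS=-L$HOME/fftw/lib
    ./configure --prefix=$HOME/gromacs --enable-mpi
    make && make install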
I am running a variety of NPT simulations with polar, non-polar, and ionic
compounds. Although my results for density agree well with experimental
values, the pressures I get from g_energy are off by 1 to 3 orders of
magnitude. In the log file, the pressure fluctuates around a lot from -400
to 40
from -1000 to 1000, depending on the system. The simulation box is 8x8x8 nm
(roughly) and contains about 12,000 atoms
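Instantaneous pressure in a box of this size genuinely swings by hundreds
of bar; what should sit near the 1 bar reference is the time average and
its error estimate. A sketch of extracting that (placeholder file names):

    # write the pressure trace, then let g_analyze print the statistics
    echo "Pressure" | g_energy -f md.edr -s md.tpr -o pressure.xvg
    g_analyze -f pressure.xvg    # prints average and fluctuation statistics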
On Thu, Jan 20, 2011 at 3:09 PM, Justin A. Lemkul wrote:
>
>
> Denny Frost wrote:
>
>> I am running a variety of NPT simulations with polar, non-polar, and
gen_vel = yes
gen_temp= 300.0
gen_seed= 10
On Thu, Jan 20, 2011 at 3:49 PM, Justin A. Lemkul wrote:
>
>
> Denny Frost wrote:
>
>> from -1000 to 1000, depending on the system. The simulation box is 8x8x8
>> nm (roughly) and contains about 12,000 atoms
I am taking over a project for a graduate student who did MD using Gromacs
3.3.3. I now run similar simulations with Gromacs 4.5.1 and find that they
run only about 1/2 to 1/3 as fast as the previous runs done in Gromacs
3.3.3. The runs have about the same number of atoms and both use opls force
fields.
decomposition
There are: 12800 Atoms
Max number of connections per atom is 63
Total number of connections is 286400
Max number of graph edges per atom is 6
Total number of graph edges is 24800
On Thu, Jan 27, 2011 at 4:32 PM, Justin A. Lemkul wrote:
>
>
> Denny Frost wrote:
>
>>
26 PM, Justin A. Lemkul wrote:
>
>
> Denny Frost wrote:
>
>> I just realized that was a very old mdp file. Here is an mdp file
>> from my most recent run as well as what I think are the domain decomposition
>> statistics.
>>
>> mdp file:
>> title
gromacs 4.5.1
On Fri, Jan 28, 2011 at 12:40 PM, Erik Marklund wrote:
> PME is still an Ewald sum.
>
> Erik
>
> Denny Frost skrev 2011-01-28 20.38:
>
> I don't have any domain decomposition information like that in my log file.
> That's worrisome. The only ot
he speed issues.
On Fri, Jan 28, 2011 at 12:46 PM, Justin A. Lemkul wrote:
>
>
> Denny Frost wrote:
>
>> gromacs 4.5.1
>>
>>
> Ah, what I posted was from 4.0.7. I wonder why that sort of output was
> eliminated in 4.5; it's quite useful. Sorry for leadi
all 8 nodes are running at full capacity, though
On Fri, Jan 28, 2011 at 1:13 PM, Justin A. Lemkul wrote:
>
>
> Denny Frost wrote:
>
>> Here's what I've got:
>>
>> M E G A - F L O P S A C C O U N T I N G
>>
>> RF=Reaction-Field FE=Free Energy
s, but I think it's a separate
issue to take up with my supercomputing facility.
On Fri, Jan 28, 2011 at 1:18 PM, Justin A. Lemkul wrote:
>
>
> Denny Frost wrote:
>
>> all 8 nodes are running at full capacity, though
>>
>>
> What is your mdrun command line?
at 1:32 PM, Justin A. Lemkul wrote:
>
>
> Denny Frost wrote:
>
>> Here's my grompp command:
>>
>> grompp_d -nice 0 -v -f md.mdp -c ReadyForMD.gro -o md.tpr -p top.top
>>
>> and my mdrun command is this:
>> time mpiexec mdrun_mpi -np
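Worth noting, since the thread goes on to suspect eight identical copies
of one job: in 4.x the process count belongs to the MPI launcher, not to
mdrun, and an mdrun binary built without MPI will happily run once per
rank. A sketch of the intended shape (core count and names are
placeholders):

    # 8 MPI processes, one run; mdrun_mpi must actually be MPI-enabled
    mpiexec -np 8 mdrun_mpi -deffnm md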
In the log file, when gromacs specifies "Nodes," does it mean processors?
On Fri, Jan 28, 2011 at 1:44 PM, Justin A. Lemkul wrote:
>
>
> Denny Frost wrote:
>
>> I'm leaning toward the possibility that it is actually only running 8
>> copies of the same job
Since gromacs allows you to use quite a few different force fields with
different naming schemes, how does it know (from reading the topology file)
which atoms are hydrogens to enforce the hbond constraints?
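My understanding (an assumption worth checking against the manual) is that
grompp identifies hydrogens by their mass in the topology rather than by
atom name, so the force-field naming scheme does not matter. The mdp side
is just:

    ; convert bonds involving hydrogen into constraints
    ; (assumption: hydrogen detection is mass-based, not name-based)
    constraints          = h-bonds
    constraint_algorithm = lincs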
wrote:
> On 21/01/2011 10:12 AM, Denny Frost wrote:
>
> Sorry, I'm referring to a lot of runs here - some fluctuate more than
> others and some have greater average values than others. The average value
> is never greater than the maximum fluctuation in each run, so that is n
I don't have an RMS section on my log file.
The final xvg file that comes from g_energy is much too long to post here,
but contains exactly what is in the log file. The interactive output from
g_energy is, however, thus:
Pressure = 995.9 bar (error = 0.65 bar)
On Tue, Feb 1, 2011 at
er).
Here is my g_energy command:
g_energy -f md.edr -s md.tpr -o energy.xvg -b 19000 -e 2
I think I might just write a script file to parse the xvg file from g_energy
to get me the correct values.
On Tue, Feb 1, 2011 at 3:31 PM, Justin A. Lemkul wrote:
>
>
> Denny Frost wrote:
I am trying to start a run using domain decomposition on a 5x5x10 nm box
with about 26,000 atoms in it. I've tried running 8-16 pp nodes, but
gromacs always throws an error saying that there is no domain decomposition
compatible with this box and a minimum cell size of 6.728 nm. I've tried
many v
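The usual knobs for this error (a sketch; whether they are safe depends on
why the minimum cell size is so large, e.g. long-range bonded
interactions): cap the distance DD reserves for bonded terms, or bypass
domain decomposition entirely.

    # assume a smaller bonded-interaction range (nm) for DD ...
    mpiexec -np 8 mdrun_mpi -rdd 1.4 -deffnm md
    # ... or fall back to particle decomposition
    mpiexec -np 8 mdrun_mpi -pd -deffnm md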
tau_p = 2.0
ref_p = 1.0
compressibility = 4.5e-5
; Generate velocities is on at 300 K.
gen_vel = yes
gen_temp= 300.0
gen_seed= 10
On Wed, Feb 9, 2011 at 1:39 PM, Justin A. Lemkul wrote:
>
>
> Denny Frost wrote:
>
>> I am trying
On Wed, Feb 9, 2011 at 1:56 PM, Justin A. Lemkul wrote:
>
>
> Denny Frost wrote:
>
>> I'm using version 4.5.3
>>
>> Here's the output from the log file from DD initiation to the error:
>>
>> Initializing Domain Decomposition on 8 nodes
>> Dynam
Is tpbconv with the "pbc" option the best way to make the molecules whole
again?
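For later readers: I believe the tool for this is trjconv rather than
tpbconv; a sketch with placeholder names:

    # rejoin molecules broken across the periodic boundary
    trjconv -f md.xtc -s md.tpr -pbc whole -o md_whole.xtc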
On Wed, Feb 9, 2011 at 2:06 PM, Justin A. Lemkul wrote:
>
>
> Denny Frost wrote:
>
>> This run is actually a combination of two 5x5x5 nm boxes, one of which
>> was previously