This is what I mean by "quick and dirty".
#!/bin/bash
# Reads the output of "g_sas" per residue (-or) and writes only the residues you want.
# Diego E.B. Gomes, Roberto D. Lins, Pedro G. Pascutti, Thereza A. Soares
# mailto: di...@biof.ufrj.br
# 1) To perform multiple g_sas (will write out
There's a quick & dirty workaround. You can write a script to "g_sas" each
frame individually, writing many output files. Then grab the results from there.
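A minimal sketch of that per-frame loop, assuming a trajectory traj.xtc, a run input topol.tpr, and that the Protein group is used for both the calculation and the output group (file names, the time range and the group choice are just placeholders):

#!/bin/bash
# Extract one frame every 10 ps with trjconv, run g_sas on it with
# per-residue output (-or), and keep one result file per frame.
for t in $(seq 0 10 1000); do
    echo "Protein" | trjconv -f traj.xtc -s topol.tpr -dump $t -o frame_${t}.pdb
    echo "Protein Protein" | g_sas -f frame_${t}.pdb -s topol.tpr \
        -o area_${t}.xvg -or resarea_${t}.xvg
done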
Sent from my iPhone
On Jul 7, 2011, at 9:17 PM, "Justin A. Lemkul" wrote:
>
>
> ahmet yıldırım wrote:
>> There are hydrophilic and hydr
Get gromacs 4.0.4.
Confirm that you changed the permissions correctly and can write
anywhere in /usr/local/.
Are you doing this as 'root' or with super user privileges (sudo)? Because if you
are just a regular user, you won't be able to change the permissions
in /usr/local/. So, configure gmx to be installed
Did you download the "dssp" software from here?
http://swift.cmbi.ru.nl/gv/dssp/HTML/dsspcmbi
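If the problem is just that you cannot write to /usr/local/bin, here is a minimal sketch of a user-level install, assuming the downloaded binary is called dsspcmbi and that $HOME/bin is an acceptable location (both are assumptions):

# Install dssp somewhere you can write to and point do_dssp at it
# through the DSSP environment variable.
mkdir -p $HOME/bin
cp dsspcmbi $HOME/bin/dssp
chmod +x $HOME/bin/dssp
export DSSP=$HOME/bin/dssp
do_dssp -f traj.xtc -s topol.tpr -o ss.xpm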
On Mar 10, 2009, at 1:12 AM, Homa Azizian wrote:
Hi
when I use do_dssp the following error comes:
Program do_dssp, VERSION 4.0.3
Source code file: do_dssp.c, line: 471
Fatal error:
Failed to execu
You mean that half the protein is on one side of your cubic box and
the other half is on the other side?
Try removing PBC or fitting to a reference frame (like the one before
SA). Search the list archives, this is a very common question.
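A minimal sketch of both options with trjconv, assuming traj.xtc and topol.tpr; the group names and file names are just placeholders:

# Undo jumps across the periodic boundaries so the protein stays whole.
echo "System" | trjconv -f traj.xtc -s topol.tpr -pbc nojump -o nojump.xtc
# Or fit every frame onto the reference structure in the .tpr
# (least-squares fit on the backbone, whole system written out).
echo "Backbone System" | trjconv -f nojump.xtc -s topol.tpr -fit rot+trans -o fitted.xtc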
On Mar 9, 2009, at 10:17 PM, Homa Azizian wrote:
I
Hi Andrew, I have just repeated this tutorial. There is nothing wrong
with it.
Download all the files again and repeat the tutorial from the beginning.
It could be a file that was corrupted during your previous download, or
you may have done something wrong in one of the steps.
You can copy&paste the
On Mar 9, 2009, at 5:12 PM, Mark Abraham wrote:
Xiang Mao wrote:
My PC has an Intel Core 2 CPU. I compiled gromacs under Cygwin,
using MPICH2. After mpdboot, I use mpirun -n 2 mdrun_mpi to
run EM and MD.
OK that might be working right. Now you need to look in the log file
for an indi
Looks like you are using MPICH2 as your "mpi" software.
Try including "mpirun" before mdrun_mpi:
mpirun -n 4 mdrun_mpi -v -s topol.tpr
If that doesn't work you should start the MPI daemon (MPD) before
mpirun:
mpdboot
mpirun -n 2 mdrun_mpi -v -s topol.tpr
After your job finishes you might want
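For reference, a minimal sketch of the whole MPICH2/MPD cycle, assuming MPICH2 with the MPD process manager and a 2-process run (the process count and file names are just placeholders):

# Start the MPD daemon, run the parallel job, then shut MPD down again.
mpdboot
mpirun -n 2 mdrun_mpi -v -s topol.tpr -deffnm md
mpdallexit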
These results are not strange.
Performance results really depend on the size/setup of your system.
Next time use gmxbench so we can have a better reference.
Are you using gromacs 4.0.4? It scales much better than the gromacs-3.x.x
versions. Anyway, this very bad scaling is normal over gigabit
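If it helps, a minimal sketch of getting a comparable number out of one of the gmxbench systems; the d.dppc directory and the 4-core run are assumptions, and the input file names follow the usual grompp defaults:

# Build the run input for the benchmark system, run it in parallel,
# and read the ns/day figure from the end of the log.
cd d.dppc
grompp -f grompp.mdp -c conf.gro -p topol.top -o topol.tpr
mpirun -n 4 mdrun_mpi -v -s topol.tpr -deffnm bench
grep "Performance" bench.log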
Check the VMD list, you should find the script there.
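If it is useful as a starting point, a minimal sketch of producing the input such scripts usually need: the extreme projections along the first eigenvector, written with g_covar/g_anaeig and then loaded into VMD (file names, group choices and the number of frames are assumptions):

# Diagonalise the covariance matrix, then write an interpolation between
# the two extreme projections along eigenvector 1 as a short trajectory.
echo "Backbone Backbone" | g_covar -f traj.xtc -s topol.tpr -o eigenval.xvg -v eigenvec.trr
echo "Backbone Backbone" | g_anaeig -f traj.xtc -s topol.tpr -v eigenvec.trr \
    -extr extreme1.pdb -first 1 -last 1 -nframes 30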
On Feb 1, 2009, at 9:28 PM, Mark Abraham wrote:
nahren manuel wrote:
Dear Gromacs Users,
I have done PCA of my MD; I want to visually represent the motions
in terms of porcupine plots. I came across Dynamite (web server)
for this purpo