Dear all


I would like to calculate the standard deviation (as the error bar) for the
dV/dlambda.xvg file. I used the g_analyze command as follows:



g_analyze -f free_bi_0.9.xvg -av average_0.9

I got:

set    average          standard deviation    std. dev. / sqrt(n-1)    …

SS1    6.053822e+01     3.062230e+01          1.936724e-02             …

Is the value in the third column (standard deviation) or the fourth column
(std. dev. / sqrt(n-1)) better to use as the standard error?
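For reference, the two columns measure different things: the standard deviation describes the spread of the individual samples, while std. dev. / sqrt(n-1) estimates the uncertainty of the *mean* (and only if the samples are statistically independent, which MD time series often are not). A minimal sketch of the two quantities, using synthetic placeholder data instead of the actual .xvg file:

```python
import numpy as np

# Hypothetical dV/dlambda samples; in practice you would read them
# from the .xvg file, e.g. with np.loadtxt(..., comments=("#", "@")).
samples = np.random.default_rng(0).normal(60.5, 30.6, size=5000)

n = samples.size
std_dev = samples.std(ddof=1)        # spread of individual samples
std_err = std_dev / np.sqrt(n - 1)   # uncertainty of the mean, assuming
                                     # independent (uncorrelated) samples

print(f"mean    = {samples.mean():.4f}")
print(f"std dev = {std_dev:.4f}")
print(f"std err = {std_err:.4f}")
```

Because std_err shrinks with the number of samples, it is usually orders of magnitude smaller than the standard deviation, which matches the output above (3.06e+01 vs 1.94e-02).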

I want to plot dG/dlambda versus lambda and show error bars for the free energy.
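The plot itself can be sketched with matplotlib once you have one (average, error) pair per lambda window; the values below are placeholders, not real results:

```python
import matplotlib
matplotlib.use("Agg")  # non-interactive backend, so this also runs headless
import matplotlib.pyplot as plt
import numpy as np

# Hypothetical data: one (average, error) pair per lambda window.
lambdas = np.array([0.0, 0.3, 0.6, 0.9])
dg_dlambda = np.array([55.1, 58.7, 60.2, 60.5])  # placeholder averages
errors = np.array([0.02, 0.03, 0.02, 0.02])      # placeholder standard errors

plt.errorbar(lambdas, dg_dlambda, yerr=errors, fmt="o-", capsize=3)
plt.xlabel(r"$\lambda$")
plt.ylabel(r"$\langle \partial V / \partial \lambda \rangle$")
plt.savefig("dgdl_errorbars.png")
```

The free energy difference would then come from integrating this curve over lambda (e.g. with np.trapz), with the error bars propagated accordingly.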



Thanks in advance

Afsaneh