Kay, the usual propagation-of-uncertainty formulae are based on a
first-order approximation of the Taylor series expansion, i.e. they assume
that 2nd and higher order terms in the series can be neglected. This is
clearly not the case if B is small relative to its uncertainty: you would
need to include the higher-order terms.
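A quick numeric illustration of that breakdown (a sketch, not part of the
original thread; the means and sigmas are invented and numpy is assumed):

import numpy as np

# Compare the first-order formula for sig(A/B) with the spread of
# actually sampled ratios as mean(B) approaches sig(B).
rng = np.random.default_rng(1)
A, sigA, sigB, n = 1.0, 0.1, 1.0, 1_000_000

for B in (10.0, 3.0, 1.0):  # B sits 10, 3, then 1 sigma away from zero
    first_order = abs(A / B) * np.sqrt((sigA / A) ** 2 + (sigB / B) ** 2)
    mc = (rng.normal(A, sigA, n) / rng.normal(B, sigB, n)).std()
    print(f"B = {B:4.1f}: first-order {first_order:.4f}, sampled {mc:.4f}")

At ten sigma the two numbers agree well; as B approaches one sigma the
sampled spread dwarfs the first-order prediction and varies wildly from
run to run.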
What I'm missing in those formulas, and in the Wikipedia article, is a
discussion of the prerequisites - it seems to me that, roughly speaking,
if the standard deviation of B is as large as or larger than the absolute
value of the mean of B, then we might divide by 0 when calculating A/B.
This should, in effect, make the variance of A/B infinite.
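One way to check this numerically (again a sketch with invented numbers,
numpy assumed): when sig(B) exceeds |mean(B)|, the sample standard
deviation of A/B never settles down as the sample grows, because the
variance of the ratio does not actually exist.

import numpy as np

# B ~ N(0.5, 1), so sig(B) is twice |mean(B)|: draws of B near zero are
# common and the ratio A/B is extremely heavy-tailed.
rng = np.random.default_rng(2)
for n in (10**3, 10**5, 10**7):
    ratios = rng.normal(1.0, 0.1, n) / rng.normal(0.5, 1.0, n)
    # The sample std is dominated by the few smallest |B| draws and
    # tends to keep growing with n instead of converging.
    print(n, ratios.std())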
Just so that an actual answer appears in the archives of the CCP4BB:
If you define C = A/B and also define sig(X) as the standard deviation of
X, where X can be A, B or C, then you can get sig(C) from:
(sig(C)/C)^2 = (sig(A)/A)^2 + (sig(B)/B)^2
Note the subtle difference from the rule for a product C = A*B: for
uncorrelated A and B the relative variances add in exactly the same way,
but if A and B are correlated, the covariance term enters with the
opposite sign.
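For concreteness, a minimal Python sketch of that rule (the function name
and the numbers are mine, not from the thread; numpy is assumed), with a
Monte Carlo check in a regime where B is many sigmas away from zero:

import numpy as np

def sig_ratio(A, sigA, B, sigB):
    # First-order propagation for C = A/B with uncorrelated A and B:
    # (sig(C)/C)^2 = (sig(A)/A)^2 + (sig(B)/B)^2
    C = A / B
    return abs(C) * np.sqrt((sigA / A) ** 2 + (sigB / B) ** 2)

rng = np.random.default_rng(0)
A, sigA, B, sigB = 10.0, 1.0, 5.0, 0.25  # B is 20 sigma from zero
samples = rng.normal(A, sigA, 1_000_000) / rng.normal(B, sigB, 1_000_000)
print(sig_ratio(A, sigA, B, sigB))  # ~0.224
print(samples.std())                # close to the formula's value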
The short answer can be found in item 2 at this link:
http://science.widener.edu/svb/stats/error.html
The long answer: I highly recommend "An Introduction to Error Analysis"
by John R. Taylor.
If you can find the first edition (which can fit in your pocket), so much
the better.
http://en.wikipedia.org/wiki/Propagation_of_uncertainty
From: CCP4 bulletin board [mailto:CCP4BB@JISCMAIL.AC.UK] On Behalf Of
capricy gao
Sent: Saturday, June 04, 2011 10:45 AM
To: CCP4BB@JISCMAIL.AC.UK
Subject: [ccp4bb] Question about the statistical analysis - might be a bit off topic
If the means and standard deviations of A and B are known, how does one
estimate the variance of A/B?
Thanks.