Just so that an actual answer appears in the archives of the CCP4BB: If you define C = A/B and also define sig(X) as the standard deviation of X, where X can be A, B, or C, then you can get sig(C) from:
(sig(C)/C)^2 = (sig(A)/A)^2 + (sig(B)/B)^2

Note the subtle difference from the rule for propagating errors through addition and subtraction. When you are adding or subtracting, the total error is:

sig(A+B)^2 = sig(A)^2 + sig(B)^2

that is, the square root of the sum of the squares of all the individual "sigmas". But when multiplying or dividing, it is the "percent error" (relative error), rather than the sigma itself, that you run through the root-sum-square calculation. It is interesting, I think, that you get the SAME rule for multiplying as for dividing. Either way, errors always increase. (A quick numerical check of both rules is sketched below, after the quoted message.)

As usual, all this assumes that the errors come from a Gaussian (normal) distribution and that fluctuations in A are uncorrelated with fluctuations in B. Other distributions or correlated errors will have different propagation rules, and for those you might want to actually read a statistics book.

-James Holton
MAD Scientist

On Sat, Jun 4, 2011 at 10:44 AM, capricy gao <capri...@yahoo.com> wrote:
>
> If means and standard deviations of A and B are known, how to estimate the
> variance of A/B?
>
> Thanks.
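
As a sanity check of the two propagation rules above, here is a minimal Monte Carlo sketch in Python with numpy. The means, sigmas, and sample size are arbitrary example values chosen only for illustration (small relative errors, since the quotient rule is a first-order approximation); they are not from the original question.

# Monte Carlo check of the error-propagation rules for A/B and A+B.
# The numbers below are made-up example values, not from the thread.
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# Assumed: A and B are independent Gaussians with small relative errors.
A_mean, A_sig = 100.0, 2.0
B_mean, B_sig = 50.0, 1.5

A = rng.normal(A_mean, A_sig, n)
B = rng.normal(B_mean, B_sig, n)

# Division: the *percent* errors add in quadrature.
C = A / B
pred_sig_C = (A_mean / B_mean) * np.sqrt((A_sig / A_mean)**2 + (B_sig / B_mean)**2)
print("sig(A/B): simulated %.4f  predicted %.4f" % (C.std(), pred_sig_C))

# Addition: the sigmas themselves add in quadrature.
S = A + B
pred_sig_S = np.sqrt(A_sig**2 + B_sig**2)
print("sig(A+B): simulated %.4f  predicted %.4f" % (S.std(), pred_sig_S))

With these example values the predicted sig(A+B) is sqrt(2.0^2 + 1.5^2) = 2.5, and the predicted sig(A/B) is 2.0 * sqrt(0.02^2 + 0.03^2) ≈ 0.072; the simulated spreads should agree to within sampling noise. Squaring sig(A/B) gives the variance the original poster asked about.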