On Sun, 24 Feb 2008 10:09:37 -0800, Lie wrote:

> On Feb 25, 12:46 am, Steve Holden <[EMAIL PROTECTED]> wrote:
>> Lie wrote:
>> > On Feb 18, 1:25 pm, Carl Banks <[EMAIL PROTECTED]> wrote:
>> >> On Feb 17, 1:45 pm, Lie <[EMAIL PROTECTED]> wrote:
>>
>> >>>> Any iteration with repeated divisions and additions can thus run
>> >>>> the denominators up. This sort of calculation is pretty common
>> >>>> (examples: compound interest, numerical integration).
>>
>> >>> Wrong. Addition and subtraction would only grow the denominator up
>> >>> to a certain limit.
>>
>> >> I said repeated additions and divisions.
>>
>> > Repeated addition and subtraction can't make fractions grow
>> > infinitely; only multiplication and division could.
>>
>> On what basis is this claim made?
>>
>> (n1/d1) + (n2/d2) = ((n1*d2) + (n2*d1)) / (d1*d2)
>>
>> If d1 and d2 are mutually prime (have no common factors) then it is
>> impossible to reduce the resulting fraction further in the general case
>> (where n1 = n2 = 1, for example).
>>
>> >> Anyways, addition and subtraction can increase the denominator a lot
>> >> if for some reason you are inputting numbers with many different
>> >> denominators.
>>
>> > Up to a certain limit. After you reach the limit, the fraction
>> > would always be simplifiable.
>>
>> Where does this magical "limit" appear from?
>>
>> > If the input numerator and denominator have a defined limit, repeated
>> > addition and subtraction with another fraction will also have a
>> > defined limit.
>>
>> Well, I suppose if you limit the input denominators to n then you have
>> a guarantee that the output denominators won't exceed n!, but that
>> seems like a pretty poor guarantee to me.
>>
>> Am I wrong here? You seem to be putting out unsupportable assertions.
>> Please justify them or stop making them.
>
> Well, I did a test on my own fraction class. I found that if we set a
> limit on the numerators and denominators, the resulting output fraction
> has a limit too. I can't grow my fraction beyond this limit no matter
> how many iterations I do on it. I did the test with something like this
> (I don't have the source code with me right now, and it's quite long if
> it includes the fraction class, but I think you could use any fraction
> class that automatically simplifies itself; I might post the real code
> some time later):
>
> while True:
>     a = randomly do (a + b) or (a - b)
>     b = random fraction between [0-100]/[0-100]
>     print a
>
> And this limit is much lower than n!. I think it's sum(primes(n)), but
> I've got no proof for this one yet.
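For anyone who wants to try this at home, here is a minimal runnable
sketch of the test Lie describes, using the standard library's
fractions.Fraction (which reduces automatically) as a stand-in for his
unposted class; the bounded iteration count and the 50/50 choice between
adding and subtracting are assumptions filled in for illustration.

import random
from fractions import Fraction

def random_fraction():
    # Mirrors "[0-100]/[0-100]"; a zero denominator isn't valid, so the
    # denominator is drawn from 1-100.
    return Fraction(random.randint(0, 100), random.randint(1, 100))

a = random_fraction()
b = random_fraction()
largest = a.denominator

for _ in range(100000):              # bounded here so the script stops
    if random.random() < 0.5:        # randomly do (a + b) or (a - b)
        a = a + b
    else:
        a = a - b
    b = random_fraction()
    largest = max(largest, a.denominator)

print("largest denominator seen:", largest)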
*jaw drops*

Please stop trying to "help" convince people that rational classes are
safe to use. That's the sort of "help" that we don't need.

For the record, it is a perfectly good strategy to *artificially* limit
the denominator of fractions to some maximum value. (That's more or less
the equivalent of setting your floating point values to a maximum number
of decimal places.) But without that artificial limit, repeated addition
of fractions risks having the denominator increase without limit.

--
Steven
--
http://mail.python.org/mailman/listinfo/python-list
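As a rough illustration of the trade-off Steven describes, the sketch
below keeps a running sum of random fractions two ways: exactly, and with
the denominator clamped after each step using Fraction.limit_denominator
(the cap value and the number of terms are arbitrary choices for the
example). The capped sum keeps a small denominator at the cost of a
little rounding error, much like working to a fixed number of decimal
places.

import random
from fractions import Fraction

random.seed(0)                      # reproducible run
terms = [Fraction(random.randint(1, 100), random.randint(1, 100))
         for _ in range(50)]

CAP = 1000                          # arbitrary maximum denominator

exact = Fraction(0)
capped = Fraction(0)
for t in terms:
    exact += t
    # Clamp after each addition: the closest fraction whose
    # denominator does not exceed CAP.
    capped = (capped + t).limit_denominator(CAP)

print("exact denominator :", exact.denominator)
print("capped denominator:", capped.denominator)
print("rounding error    :", float(abs(exact - capped)))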