On Tue, 04 Mar 2008 11:46:48 -0200, NickC <[EMAIL PROTECTED]> wrote:
> A mildly interesting Py3k experiment:
>
> Python 3.0a3+ (py3k:61229, Mar 4 2008, 21:38:15)
> [GCC 4.1.3 20070929 (prerelease) (Ubuntu 4.1.2-16ubuntu2)] on linux2
> Type "help", "copyright", "credits" or "license" for more information.
>>>> from fractions import Fraction
>>>> from decimal import Decimal
>>>> def check_accuracy(num_type, max_val=1000):
> ...     wrong = 0
> ...     for x in range(1, max_val):
> ...         for y in range(1, max_val):
> ...             wrong += (x / num_type(y)) * y != x
> ...     return wrong
> ...
>>>> check_accuracy(float)
> 101502
>>>> check_accuracy(Decimal)
> 310013
>>>> check_accuracy(Fraction)
> 0
>
> The conclusions I came to based on running that experiment are:
> - Decimal actually appears to suffer more rounding problems than float
>   for rational arithmetic

Mmm, but I doubt that counting how many times the results come out
exactly equal is the right way to evaluate "accuracy". A stopped clock
shows the right time twice a day; a clock that loses one minute per day
shows the right time only once every two years. Clearly the stopped
clock is much better!

http://mybanyantree.wordpress.com/category/lewis-carrol/

--
Gabriel Genellina
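
A fairer comparison along these lines would measure how far off each
result is, rather than whether it is bit-for-bit exact. Below is a
minimal sketch of such a measurement (an illustration, not from the
original thread; it assumes Python 3.2+, where Fraction accepts float
and Decimal values directly):

from decimal import Decimal
from fractions import Fraction

def total_error(num_type, max_val=100):
    # Accumulate the exact magnitude of the error over all (x, y)
    # pairs, instead of counting how many results fail an equality test.
    err = Fraction(0)
    for x in range(1, max_val):
        for y in range(1, max_val):
            result = (x / num_type(y)) * y
            # Fraction(result) converts float and Decimal values
            # exactly (assumes Python 3.2+), so the difference from
            # the true answer x is measured with no further rounding.
            err += abs(Fraction(result) - x)
    return err

for num_type in (float, Decimal, Fraction):
    print(num_type.__name__, total_error(num_type))

A metric like this distinguishes a type that is wrong often but only in
the last digit from one that is wrong rarely but badly, which is exactly
the distinction the stopped-clock count hides.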