On Mon, Oct 12, 2009 at 4:02 PM, Anand Balachandran Pillai
<abpil...@gmail.com> wrote:
>
> On Mon, Oct 12, 2009 at 3:47 PM, Anand Chitipothu <anandol...@gmail.com>
> wrote:
>>
>> On Mon, Oct 12, 2009 at 2:42 PM, Baiju Muthukadan <ba...@muthukadan.net>
>> wrote:
>> > http://bitcheese.net/wiki/nopython
>> >
>> > Don't start a flame war now, please ;)
>>
>> 2.3 - 3.4 and 2/3.0 in the Python, Ruby and Haskell interpreters:
>>
>> $ python3.0
>> Python 3.0.1 (r301:69597, Feb 14 2009, 19:03:52)
>> [GCC 4.0.1 (Apple Inc. build 5490)] on darwin
>> Type "help", "copyright", "credits" or "license" for more information.
>> >>> 2.3 - 3.4
>> -1.1000000000000001
>> >>> 2/3.0
>> 0.66666666666666663
>>
>> $ irb
>> >> 2.3 - 3.4
>> => -1.1
>> >> 2/3.0
>> => 0.666666666666667
>> >> ^D
>>
>> $ ghci
>> GHCi, version 6.8.2: http://www.haskell.org/ghc/  :? for help
>> Loading package base ... linking ... done.
>> Prelude> 2.3 - 3.4
>> -1.1
>> Prelude> 2/3.0
>> 0.6666666666666666
>> Prelude> Leaving GHCi.
>>
>> It looks like the number of decimal digits printed is 17 in Python, 16 in
>> Haskell and 15 in Ruby.
>>
>> Is there any way to change that behavior in Python?
>
> Not in the interpreter AFAIK. In code, use the Decimal type.
>
> >>> import decimal
> >>> x = decimal.Decimal('2.3')
> >>> y = decimal.Decimal('3.4')
> >>> x - y
> Decimal("-1.1")
>
> I am, however, not a fan of the decimal module, since it uses strings as
> the base type.
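For what it's worth, the extra digits are only a printing artifact: all three
interpreters store the same IEEE-754 double, and Python's repr() at the time
printed 17 significant digits (from Python 2.7/3.1 onward it prints the
shortest string that round-trips). A minimal sketch of trimming the displayed
digits yourself with standard string formatting:

```python
x = 2.3 - 3.4
y = 2 / 3.0

# repr() shows the full-precision form; %g formatting lets you pick
# how many significant digits are displayed.
print('%.15g' % y)   # 0.666666666666667  (15 digits, Ruby-style)
print('%.16g' % y)   # 0.6666666666666666 (16 digits, Haskell-style)
print('%.15g' % x)   # -1.1  (%g strips the trailing zeros)
```

This only changes what is displayed, not the stored value.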
What else should it use then?

Regards
Rajeev J Sebastian
_______________________________________________
BangPypers mailing list
BangPypers@python.org
http://mail.python.org/mailman/listinfo/bangpypers
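A string is arguably the natural constructor argument here: a float literal
has already been rounded to binary before Decimal ever sees it, so only the
string form carries the exact decimal value the programmer wrote. A small
sketch of the difference (note that constructing a Decimal directly from a
float requires Python 2.7/3.2 or later; earlier versions raise TypeError):

```python
from decimal import Decimal

# The string form preserves the intended decimal value exactly:
print(Decimal('2.3') - Decimal('3.4'))   # -1.1

# The float form inherits the binary rounding error the string avoids:
print(Decimal(2.3))
# 2.29999999999999982236431605997495353221893310546875
```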