On Mon, Jun 1, 2015 at 5:58 PM, Laura Creighton <l...@openend.se> wrote:
> If you are giving a talk about Decimal -- and trying to stamp out the
> inappropriate use of floats -- you have to first inform people that
> what they learned as 'decimals' as children was not floating point,
> despite the fact that we write them the same way.
>
> If I ever get the time machine, I am going back in time and demand that
> floating point numbers be expressed as 12345:678 instead of 12345.678
> because it would save so much trouble. Never has the adage 'It's not
> what you don't know, that bites you. It's what you know that ain't so.'
> been more apt.
While I agree that there are problems with conflating float with "real
number", I don't know that decimal.Decimal will actually solve them
either. And using a colon as the decimal separator won't solve anything:
the world already has two separators ("." in the US, "," in much of
Europe), and adding a third with the same semantics - separating the
whole-unit digits from the sub-unit digits - won't instantly change how
the numbers themselves are handled. So I would be whacking you over the
head when you invent that time machine, XKCD 716 style.

decimal.Decimal() has its own peculiarities, ways in which it isn't the
same as real numbers, so what's really needed is a talk about both of
them (and fractions.Fraction, for completeness) and how they all have
their uses. People need to grok that what computers do with numbers is
never quite what they're used to from grade school (although it can be
close; bignum integer arithmetic is pretty reliable).

ChrisA
-- 
https://mail.python.org/mailman/listinfo/python-list
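P.S. A quick sketch of the peculiarities I mean, in Python 3 (assuming
the default Decimal context of 28 significant digits):

```python
from decimal import Decimal
from fractions import Fraction

# Binary floats cannot represent 0.1 exactly, so grade-school
# arithmetic breaks down:
assert 0.1 + 0.2 != 0.3

# Decimal fixes that particular case...
assert Decimal('0.1') + Decimal('0.2') == Decimal('0.3')

# ...but it has its own limits: 1/3 gets rounded to the context's
# precision, so tripling it does not get back to exactly 1.
assert Decimal(1) / Decimal(3) * 3 != Decimal(1)

# Fraction stays exact for any rational arithmetic:
assert Fraction(1, 3) * 3 == Fraction(1)
```

None of the three is "real numbers" - each just draws its rounding line
in a different place.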