"J. Clifford Dyer" <[EMAIL PROTECTED]> writes:
> If the issue is that it should
> remain an integer because that mimics how the computer works, then I
> think it is worth pointing out that allowing a conversion to a long also
> goes against how the computer works; the computer would have a register
> overflow.  If the issue is that in python the division operator has
> always performed integer division, and should not change, then I think
> we're talking about a philosophical opposition to Python 3 that goes far
> deeper than just how integer division behaves.

The issue is that python ints mostly act like mathematical integers
(modulo the artifact of the int/long dichotomy which is gradually
going away).  

Certainly, I'd expect that if x and y are both integers and x is an
exact multiple of y, then x/y will be computable and not overflow.
But try computing 10**5000 / 10**4000 under future division (which is
supposed to give a float): the exact quotient, 10**1000, is far too
large for a float, so the division fails even though the answer is a
perfectly good integer.
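To make the point concrete, here is a small sketch (current CPython behavior, as I understand it) contrasting true division with floor division on those same values:

```python
# True division must produce a float, but the exact quotient 10**1000
# exceeds the float range, so Python raises OverflowError.
# Floor division keeps everything in exact integers and succeeds.
x = 10 ** 5000
y = 10 ** 4000

try:
    q = x / y  # true division: tries to build a float
except OverflowError as exc:
    print("true division overflowed:", exc)

print(x // y == 10 ** 1000)  # floor division: exact, prints True
```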

> At any rate, the whole argument is irrelevant--python already supports
> floating point results for integer division, and that isn't going away,
> but neither is integer division that always results in integer results,
> so if you don't like getting floating point results, you never have to
> use the / operator.  

I don't know of any other languages that support exact integers but
give you a floating point quotient.  CL, Scheme, Haskell, and I think
ML all have exact rationals, which is what the thread is about.  All
those scary pitfalls of intermediate results becoming unboundedly
complex simply don't seem to happen.
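For what it's worth, Python can already demonstrate the exact-rational behavior those languages give by default, via the standard fractions module (this is just an illustration, not what anyone is proposing for the / operator):

```python
from fractions import Fraction

# Exact rational division: no overflow, no rounding.
q = Fraction(10 ** 5000, 10 ** 4000)
print(q == 10 ** 1000)  # the exact integer value is preserved

# Intermediate results stay exact rather than accumulating float error.
r = Fraction(1, 3) + Fraction(1, 6)
print(r)  # 1/2
```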
-- 
http://mail.python.org/mailman/listinfo/python-list