>
> > Mathematically, 1/0 is whatever you define it to be.
>
> Well, sure.  That's as axiomatic as saying, "mathematically, the
> number one is whatever you define it to be."  But a mathematical
> system that has a definition which is inconsistent with the rest of
> the system is a flawed one.  If you let 1/0 be *anything*, then
> ordinary algebraic logic falls apart.  Those silly proofs where it
> is "proven" that 1 = 2, 1 + 1 = 1, etc., all depend on division by
> zero being possible (regardless of what its value is).  You have to
> keep division by zero illegal to avoid these absurd results. 
> Hence, to my mind at least, exception-throwing or NaN is a better
> solution than infinity.
>
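For reference, this is a sketch of the classic 1 = 2 fallacy mentioned above; the whole thing hinges on silently dividing by (a - b), which is zero:

```latex
a = b
a^2 = ab
a^2 - b^2 = ab - b^2
(a + b)(a - b) = b(a - b)
a + b = b        % illegal step: dividing both sides by (a - b) = 0
2b = b
2 = 1
```

Whatever value you assign to x/0, allowing that cancellation step lets the derivation go through, which is why the operation has to stay illegal in ordinary algebra.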

My point was that there is no stone-carved mandate from the ancient 
mathematicians saying whether the value of 1 / 0 is defined or not. I 
did not intend to say that you could assign it any value. 

It is general practice among mathematicians to say that it is undefined, 
but it is also general practice among other respectable occupations to 
say it is "something like infinity", and both approaches can be 
formalized.
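As an illustration (in Python, which is an arbitrary choice here): both conventions actually coexist in practice. Python's own float division takes the "illegal" route and raises, while IEEE 754 hardware arithmetic takes the "something like infinity" route, defining 1/0 as +inf and 0/0 as NaN, values Python exposes through the math module:

```python
import math

# Convention 1: division by zero is illegal -- Python's "/" raises.
try:
    1.0 / 0.0
    outcome = "value"
except ZeroDivisionError:
    outcome = "exception"

# Convention 2: IEEE 754 defines the results instead of forbidding them;
# Python exposes the resulting values directly.
print(outcome)                 # "exception"
print(math.isinf(math.inf))    # True -- IEEE 754's answer for 1/0
print(math.isnan(math.nan))    # True -- IEEE 754's answer for 0/0
print(math.nan != math.nan)    # True -- NaN is unequal even to itself
```

Neither convention is "wrong"; they formalize different trade-offs between catching mistakes early and letting computations run to completion.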

But of course, as you said, this is largely irrelevant to the actual 
discussion.

My personal opinion is that a language that lets you add "apples" + 
"oranges" and get 0 shouldn't be too picky about 1 / 0 not being a 
"proper" number.

-angel
