MRAB wrote:
Ethan Furman wrote:
Steven D'Aprano wrote:
A mistake is still a mistake even if it is shared with others.
Treating ints with a leading zero as octal was a design error when it
was first thought up
[snippage]
I have to disagree with you on this one. The computing world was
vastly different when that design decision was made. Space was at a
premium, programmers were not touch-typists, every character had to
count, and why in the world would somebody who had to use paper tape
or punch cards add a leading zero without a *really* good reason? I
submit that that really good reason was to specify an octal literal,
and not a decimal literal.
Now many, many years have passed, much has changed, and a leading zero
(like so much else) no longer makes the sense it once did --
especially in a widespread, general-purpose language like Python.
That does not mean it was not a very good decision at the time.
I think that although it might have been reasonable when C was
invented, it wasn't a good idea when Python was invented.
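For anyone reading along who hasn't run into it, a quick sketch of the
behavior under discussion (the leading-zero form survived into
Python 2; Python 3 dropped it in favor of an explicit prefix):

    010           # Python 2: leading zero means octal, so this is the int 8
    010           # Python 3: SyntaxError -- leading zeros are not allowed
    0o10          # Python 2.6+ and Python 3: explicit octal prefix, the int 8
    int("10", 8)  # either version: parse "10" as base 8 -> 8

(PEP 3127 is the change that removed the old form and standardized the
0o prefix.)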
Very good point. I was thinking Steven was talking about the earliest
(C) case, as opposed to the earliest Python case. My apologies if I
misunderstood.
~Ethan~