Alex Martelli wrote:
> Michael Tobis <[EMAIL PROTECTED]> wrote:
> he can perfectly well correct his misexpression if he cares to -- not
> my job to try to read his mind and perform exegesis on his words.
Well, I hate to try to tell you your job, but it doesn't seem to me to
be all that great of a marketing strategy to actively chase people
away... Hey, he might have been a Nutshell customer.

> I think it would have been true, but weak and insufficient. Not only
> experienced Python users have that opinion: lack of declarations
> didn't faze me even when I was a total newbie

It did me, and it did many others. Perhaps you are unrepresentative.
It's one thing to say "no can do, sorry", it's another to say "you
don't need this anyway and if you think you do you aren't worthy".

In fact, it was your book I spent the most time thumbing through
looking for the "use strict" equivalent that I was absolutely certain
must exist. Hell, even Fortran eventually gave in to "IMPLICIT NONE".
It's practically the only thing I've ever expected to find in Python
that hasn't vastly exceeded my expectations, and I'm sure Alexander is
not the only person to be put off by it.

In fact, I'd recommend a paragraph early in the Nutshell book saying
"there are no declarations, no use strict, no implicit none, sorry,
forget it", and an index listing under "declarations" pointing to a
detailed exegesis of their nonexistence. It would have saved me some
time.

It's true that in some sense an assignment is all the declaration you
need. I think Carl Banks's point (that what we think of as assignment,
as a carryover from other languages, is really rebinding, and in many
cases can be avoided) is also helpful. But that doesn't make the
"epselon" bug go away, and wanting a way to catch it quickly isn't, to
my mind, obviously a criminal act. Also, based on what DogWalker
demonstrates, it's really not that alien to Python and should be
feasible (see the sketch further down).

> > Also, the assertion that "Python has no declarations whatsoever" is
> > no longer obviously true. In the 2.4 decorator syntax, a decorator
> > line is not executable, but rather a modifier to a subsequent symbol
> > binding. I call it a declaration.
>
> You may call it a strawberry, if you wish, but that doesn't mean it
> will taste good with fresh cream. It's nothing more and nothing less
> than an arguably weird syntax for a perfectly executable statement:

This may well be true in implementation, but cognitively it is a
declaration that modifies the reference and not the referent. I gather
that asking for more of these is considered a big deal, but I don't see
why.

> > Let me add that I remain unconvinced that a language cannot combine
> > the best features of Python with very high performance, which is
> > ultimately
>
> I'm also unconvinced. Fortunately, so is the EU, so they have approved
> very substantial financing for the pypy project, which aims in good
> part exactly at probing this issue.

I hope this works out, but it's hard for me to see how pypy will avoid
lots of hashing through dictionaries. I'm willing to help it by
declaring an immutable reference: here, don't look this up; it always
points to that. I'm guessing that this will also be considered a bad
idea, and maybe someday I'll understand why. I'm looking for insight,
not controversy.

> If any single individual can be called the ideator of pypy, I think
> it's Armin Rigo, well-known for his excellent psyco
> specializing-compiler for Python:

I'm hoping I can make the meeting. Maybe proximity to the core group
will help me approach the sort of enlightenment I seek. Just being in
the vicinity of Ian Bicking's aura on occasion has been most inspiring.
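To make the "epselon" point concrete: the typo silently creates a new
binding instead of raising an error, while attribute binding can
already be constrained today. This is only my own toy illustration (not
DogWalker's recipe, which I haven't reproduced here):

    # The typo binds a new name; nothing complains, and 'epsilon' is
    # never updated.
    epsilon = 1.0
    for _ in range(10):
        epselon = epsilon / 2.0    # meant to rebind 'epsilon'
    print epsilon                  # still 1.0

    # Attribute binding, by contrast, can be restricted right now:
    class Params(object):
        __slots__ = ('epsilon',)   # only this name may be bound

    p = Params()
    p.epsilon = 1.0
    try:
        p.epselon = 0.5            # same typo, now an AttributeError
    except AttributeError, err:
        print err

(Yes, an external checker such as pychecker catches some cases of this,
but that's outside the language proper.)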
> Almost nobody really liked the splat-syntax for decorators, except of
> course Guido, who's the only one who really counts (the BDFL). But
> that was strictly a syntax-sugar issue

Um, sugar isn't exactly what I'd call it. I think it matters a lot,
though. Python's being easy on the eyes is not a trivial advantage for
some people, myself included.

> If "declarative statement" means anything, I guess it means "having
> to tell stuff to the compiler to be taken into account during
> compilation but irrelevant at runtime". Python does have one such
> wart, the 'global' statement, and it's just as ugly as one might
> imagine, but fortunately quite marginal, so one can almost forget it.

I am trying to talk about having expressive power in constraining
references as well as the referents. Python studiously avoids this, but
decorators change that. I am not deep enough into the mojo as yet to
have more than a glimmer of an idea about the distinction you are
making. It's not the one I'm trying to make. Decorators may not be
implemented as declarations, but they cognitively act as declarations,
and that's what I care about here.

> I have nothing against a declarative style _per se_ -- it just
> doesn't fit Python's "everything happens at runtime" overall
> worldview, and that simple and powerful worldview is a good part of
> what makes Python tick SO well.

I'm glad you have said something I absolutely agree with. I'm alarmed
at the suggestions here that class and def blocks are declarative; the
fact that they're executable is really a core part of the beauty of
Python. However, I don't see how an "import strict" would necessarily
violate this, nor an "import immutableref", which is something I would
find useful in trying to wrestle with NumArray, and which a
high-performance Python could (I think) use to advantage. Now I may be
wrong; in fact I'd bet against myself and in favor of you and Frederik
if I had to bet. It's just that I don't see why I'm wrong.

-- mt
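P.S. For concreteness, here is the equivalence I take Alex to be
pointing at above (the example after his colon isn't reproduced here;
this sketch is mine, not his). The 2.4 syntax is indeed sugar for an
ordinary, perfectly executable rebinding of the function name -- which
is exactly why I say it reads like a declaration about the reference
even though it runs like a statement:

    import math

    def logged(func):
        # an ordinary function; it is called when the decorated def
        # below executes, i.e. at module execution time
        def wrapper(x):
            print 'calling', func.__name__, 'with', x
            return func(x)
        return wrapper

    @logged
    def sqrt(x):
        return math.sqrt(x)

    # ...which is just sugar for the executable statement:
    #
    #     def sqrt(x):
    #         return math.sqrt(x)
    #     sqrt = logged(sqrt)

    print sqrt(2.0)

Whether you file that under "executable statement" or "declaration that
modifies the reference, not the referent" is, I suppose, exactly the
disagreement.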