On Fri, 04 Nov 2005 22:19:39 -0800, Paul Rubin wrote:

> Steven D'Aprano <[EMAIL PROTECTED]> writes:
>>> Next you get some performance gain by using gmpy to handle the long
>>> int arithmetic,
>>
>> Then whatever happens next will be my own stupid fault for prematurely
>> optimising code.
>
> Huh? There's nothing premature about using gmpy if you need better
> long int performance. It was written for a reason, after all.
Sure, but I would be willing to bet that incrementing a counter isn't it.

>> What exactly is your point? That bugs can happen if the behaviour of
>> your underlying libraries changes?
>
> That your initialization scheme is brittle--the idea of data
> abstraction is being able to change object behaviors -without- making
> surprising bugs like that one. You don't even need the contrived gmpy
> example. You might replace the level number with, say, a list of
> levels that have been visited.

Do you expect level += 1 to still work when you change level to a list
of levels?

The problem with data abstraction is that, taken seriously, it means
"you should be able to do anything with anything". If I change
object.__dict__ to None, attribute lookup should still work, yes? No?
Then Python isn't sufficiently abstract. As soon as you accept that
there are some things you can't do with some data, you have to stop
abstracting.

*Prematurely* locking yourself into one *specific* data structure is
bad: as a basic principle, data abstraction is very valuable -- but in
practice there comes a time when you have to say "Look, just choose a
damn design and live with it." If you choose sensibly, it won't matter
whether your counter is an int or a long or a float or a rational --
but you can't sensibly expect to change your counter to a binary tree
without a major redesign of your code.

I've watched developers with an obsession with data abstraction in
practice. I've watched one comp sci graduate, the ink on his diploma
not even dry yet, spend an hour mapping out state diagrams for a
factorial function. Hello McFly? The customer is paying for this, you
know. Get a move on. I've written five different implementations of
factorial in ten minutes, and while none of them worked with symbolic
algebra, I didn't need symbolic algebra support, so I lost nothing by
not supporting it.

So I hope you'll understand why I get a bad taste in my mouth when
people start talking about data abstraction.
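To make the point concrete, here is a minimal sketch. The names (level,
visited) are made up for illustration; the point is only that += 1 works
across the whole numeric tower, but stops making sense once the counter
becomes a list:

```python
# A numeric counter survives a change of numeric type...
level = 1
level += 1            # ints support +
assert level == 2

level = 1.0
level += 1            # so do floats (or longs, Decimals, Fractions)
assert level == 2.0

# ...but not a change to a fundamentally different structure.
visited = [1, 2]      # a list of levels that have been visited
try:
    visited += 1      # list += tries to iterate over the right side
except TypeError:
    pass              # TypeError: 'int' object is not iterable
visited += [3]        # the list version needs different code anyway
assert visited == [1, 2, 3]
```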
> I don't think the culprit is the mutable/immutable distinction +=
> uses, though that is certainly somewhat odd. I think Antoon is on the
> right track: namespaces in Python live in sort of a ghetto unbecoming
> of how the Zen list describes them as a "honking great idea". These
> things we call variables are boxed objects where the namespace is the
> box. So having x+=y resolve x to a slot in a namespace before
> incrementing that same slot by y, maybe better uses the notion of
> namespaces than what happens now.

Perhaps it does, but it breaks inheritance, which is more important
than purity of namespace resolution. Practicality beats purity.

> I'm too sleepy to see for sure
> whether it gets rid of the mutable/immutable weirdness.

What weirdness? What would be weird is if mutable and immutable
objects worked the same as each other. They behave differently because
they are different. If you fail to see that, you are guilty of
excessive data abstraction.


-- 
Steven.

-- 
http://mail.python.org/mailman/listinfo/python-list
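[Both claims above can be sketched in a few lines. The class name
Counter below is made up for illustration. First, x += y on an instance
may *read* through inheritance but *bind* in the instance's namespace,
which is why resolving x to one namespace slot up front would break
inheritance; second, the mutable/immutable difference in += is just
different objects behaving differently:]

```python
class Counter:
    count = 0            # class attribute, shared default

c = Counter()
c.count += 1             # reads Counter.count, then binds c.count
assert c.count == 1      # the instance now has its own attribute
assert Counter.count == 0  # the class default is untouched

# Mutable: list += mutates the object in place, so aliases see it.
nums = [1, 2]
alias = nums
nums += [3]
assert alias == [1, 2, 3]

# Immutable: int += rebinds the name to a new object; other names
# bound to the old object are unaffected.
n = 1
m = n
n += 1
assert n == 2 and m == 1
```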