A cautionary tale of what happens when religious wars enter programming debates. For all I know, Paul Rubin is intelligent, gentle, kind to animals and small children, generous, charitable and modest. But touch his religious belief in the necessity of "truly" private variables, and boy oh boy does he lose the plot.
First he compares using Python to being raped, earning (perhaps unfairly) at least one plonking, and then he shows that no matter how intelligent one might be, sense goes out the window when religion enters the door. See below.

On Fri, 30 Sep 2005 05:23:35 -0700, Paul Rubin wrote:

> Steven D'Aprano <[EMAIL PROTECTED]> writes:
>> > It's not easy if the base classes change after you check your code
>> > in. You shouldn't need to know about that if it happens.
>> > Modularity, remember?
>>
>> Yes. And if you are relying on a public method in a class, and
>> somebody dynamically modifies that public method, your code will
>> stop working too.
>
> I'm not talking about dynamic anything. I'm talking about a normal
> software project where there are multiple people working on the code.

Paul misses the point: in a language like Python, which allows the dynamic modification of public methods (as well as private ones), you can't be sure with absolute 100% mathematical certainty that your public interface is the same at runtime as it was when you did your last code audit.

Paul worries about the possibility that some calling function will clobber a private variable that must not be clobbered. Why is this threat more worrisome than some calling function clobbering a public variable that must not be clobbered? Logically, it is not: if a function stuffs garbage into Klass.public_variable and breaks your entire program, that is no better and no worse than a function stuffing garbage into Klass._private_variable.

There may be languages that make it impossible for the caller to clobber the public interface, but Python is not one of them. Since you can't prevent people from clobbering your public interface (by accident or design), and must instead (1) rely on the convention "don't break my code!" and (2) test to be sure that people haven't broken your code, why worry MORE that people might clobber your private implementation?
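To make the symmetry concrete, here is a minimal sketch (the class body and attribute values are invented for illustration; only the names Klass.public_variable and Klass._private_variable come from the discussion above). Nothing in the language treats one assignment differently from the other:

```python
class Klass:
    def __init__(self):
        self.public_variable = 42
        self._private_variable = "internal state"

obj = Klass()

# Both assignments are equally legal, and equally destructive:
obj.public_variable = None           # clobbers the public interface
obj._private_variable = "garbage"    # clobbers the private implementation

# Even methods on the class itself can be replaced at runtime --
# the "dynamic modification of public methods" mentioned above:
Klass.__init__ = lambda self: None
```

After the last line, every new Klass() instance is created without its attributes at all, public or private. The only defences are convention and testing, in both cases.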
You deal with that problem the same way that you deal with the first: by convention ("attributes starting with an underscore are private; don't clobber them or rely on them being there!") and by testing.

> You write a class and carefully make sure that none of its private
> variables collide with superclasses in modules that it imports. You
> check in your code and go do something else. Then the person
> maintaining the superclasses goes and changes how they use their
> private variables. He doesn't look at your code since his modules
> don't import yours. But now both your code and his are broken.

I'm surprised that for somebody who is so worried about bugs, Paul is satisfied with the "I didn't change anything that anyone can see, so the code must still be working" school of testing. (Otherwise known as the "I never changed anything" bug.)

If your development team uses test-driven development, you will catch this breakage as soon as the code is checked in. If you don't, you've got lots more bugs that you don't know about, so one more is no big deal. Problem solved. Next?

>> perhaps that mathematical certainty is appropriate for
>> your ICBM control system or nuclear reactor, but it is a needless
>> affectation for (say) a word processor.
>
> Why on earth would you want to unnecessarily introduce random bugs
> into a word processor or anything else?

Paul puts words into my mouth: I never suggested that it was good to introduce bugs, unnecessarily or otherwise. What I said was that aiming for 100% mathematical certainty that there are no bugs in a word processor is pointless. No lives are at stake. The fate of nations doesn't rest on your ability to make sure that PyWord is 100% bug free. Relax dude, the world won't end if there is a bug or two in your code.

Of course, if Paul had been paying attention, he would have realised that 100% certainty is not possible in general no matter what style of development you use. Perhaps folks have heard of the halting problem?
In programs which are small enough for you to trace through every possible program path, you might be able to get that 100% certainty, but for the rest, nope, no way. (The halting problem doesn't say that no program can check the correctness of *any* program, only that no program can check the correctness of *all* programs.)

You can reduce the probability of bugs by using best practices, but you can never reduce it to zero. How small do you need it to be? Is it really shocking to suggest that while a one in a billion billion chance of a bug might be needed for a nuclear reactor, we might be satisfied with just one in a million for a word processor?

> And what happened to the
> marketing claims that Python was good for critical applications?

Paul makes the mistake of thinking that the truth of a claim depends on what its consequences are. Paul is wrong to suggest that Python being a dynamic language means that Python is not suited for critical applications, but even if he were right, that would not affect the truth of my statement.

You can write broken or malicious C code that passes both visual inspection and even the best of automated testing (see, for example, the Underhanded C Contest http://www.brainhz.com/underhanded/ and the Obfuscated C Contest http://www.ioccc.org/). Should we decide that C is not suitable for critical applications? Well, perhaps we should -- but we don't. And neither should we jump to that conclusion about Python.

It is true that your critical application in Python can be broken by a piece of code that modifies your class's public interfaces. So? Your C application might write directly to memory, clobbering its entire internal structure, and erase your hard disk to boot just to be sure. What's important is not whether such a thing is possible, but whether your application actually does it.
In any case, there is a difference between mission-critical applications (such as, perhaps, a word processor) and what are generally called "real time" critical systems such as those controlling nuclear power stations and the like. I doubt that Python runs on any real-time operating systems suitable for such critical systems, but perhaps I'm wrong. -- Steven. -- http://mail.python.org/mailman/listinfo/python-list