Andrew Robinson wrote:

> But let me explain a bit more why I'm picking on Python: For even if we
> set the electronic engineering concerns aside that I've raised (and they
> are valid, as OOP is supposed to model reality, not reality be bent to
> match OOP) -- People's facile explanations about why Python's version of
> bool is the way it is -- still bothers me here in the python mail list
> -- because people seem to have a very wrong idea about bool's nature as
> a dualton being somehow justified solely by the fact that there are only
> two values in Boolean logic;
Nobody has suggested that except you. Earlier I even stated that I didn't
know GvR's motivation in making True and False singletons, but suggested
that it might have been a matter of efficiency. Right here, right now, the
reason doesn't matter. It doesn't matter if some other language chooses
differently, or that Python might have been designed differently. What
matters is: (1) Python is the way it is, not the way it might have been
had it been designed differently; (2) it isn't going to change; and (3) it
doesn't matter for what you are trying to do. You have invented an
imaginary problem and then spent your time complaining that Python doesn't
allow you to solve this non-existent problem in a specific way.

[...]

> -- And I know Charles bool didn't use singletons in his algebra, --
> just read his work and you'll see he never mentions them or describes
> them, but he does actually use dozens of instances of the True and
> False objects he was talking about -- for the obvious reason that he
> would have needed special mirrors, dichroic or partially silvered, to
> be even able to attempt to make one instance of True written on paper
> show up in multiple places; And that's silly to do when there's no
> compelling reason to do it.

In the words of physicist Wolfgang Pauli, that is not even wrong:

http://en.wikipedia.org/wiki/Not_even_wrong

I'm actually gobsmacked that you could seriously argue that because Boole
wrote down true and false (using whatever notation he chose) more than
once, that proves they aren't singletons. That's as sensible as claiming
that if I write your name down twice, you must be two people.

> Yet -- people here seem to want to insist that the bool type with only
> two instances is some kind of pure re-creation of what Charles Bool
> did -- when clearly it isn't.

Nobody has argued that Boole (note the spelling of his name) considered
True and False to be singletons.
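(For the record, what "singleton" means here in the concrete OOP sense is
easy to see at the interpreter. This is a small illustrative snippet of my
own, not anything Boole wrote: every comparison that yields a bool hands
back one of the same two cached objects.)

```python
# Python keeps exactly one True object and one False object; every
# expression that produces a bool returns one of those two objects.
a = (1 < 2)
b = (10 > 5)
print(a is b)            # True: both names refer to the single True object
print(a is True)         # True
print((1 > 2) is False)  # True
```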
Being a mathematician, he probably considered that there is a single
unique True value and a single unique False value, in the same way that
there is a single unique value pi (3.1415...) and a single unique value
0. But "the number of instances" and "singleton" are concepts from object
oriented programming, which didn't exist when Boole was alive. It is not
even wrong to ask whether Boole thought of true and false as singleton
objects. He no more had an opinion on that than Julius Caesar had an
opinion on whether the Falkland Islands belong to the UK or Argentina.

> It's a amalgamation of enhancements such as binary words instead of
> just True/False and many other things (operators that work on words
> rather than single bits.). So -- I don't see that Python's
> implementation of Bool is justified by either a purist appeal to
> Charles bool, or by ignoring pragmatic concerns that have been attached
> to Bool's work for years by Electrical Engineers in order to make it
> more useful for practical computer problems. Yet these two things are
> what this python list has sort of harped on.

Python's bools are objects, and they represent Boolean values, not
strings, not lists, not floats, and not three-value or four-value logic
values. They are designed for Boolean two-valued logic, like the bulk of
programming languages, not for simulating hardware. You might as well be
complaining that Python's bools are no use for doing vector arithmetic.
Correct. That's not what they're designed for. If you want vectors, don't
use a bool, write a vector class.

Three- or four-value logic is of great use to the hardware engineer
making electrical circuits; we get that, thank you. But it is of little
use in solving most programming problems, which is why no general purpose
programming language I am aware of supports non-Boolean logic in the
language itself. Simulating physical hardware is a niche.
I am impressed by your anecdote of running Python on an operating system
running on a simulated computer, but it is still a niche requirement.

[...]

> Where python is different from other languages regarding bool -- and
> deserves a little extra picking on, is that Guido has imposed four
> constraints simultaneously to bool which together cause a conflict
> that I don't think (offhand) I've ever encountered in another language;
> Definitely not in C/C++!
>
> The four things are: 1 -- he cut off subtyping and created no alternate
> method of doing type checking of duck-types,

The point of duck-typing is that you *don't* type check, not that there
is a special kind of thing called a "duck-type" that you need to check
for. Nor is there some sort of special test for "duck-typing" (although
abstract base classes and isinstance come close). If you write something
like this:

    if isinstance(flag, bool):
        if flag:
            do_this()
        else:
            do_that()

that is the opposite of duck-typing. This is duck-typing, where you just
assume that your object is of the expected type and allow it to raise an
exception if it isn't:

    if flag:
        do_this()
    else:
        do_that()

Ironically, duck-typing bools is almost unique in Python in that it is
virtually impossible for this to fail. By default, every object
regardless of its type (list, int, str, one of your 4-value logic
instances, everything) can successfully be used as if it were a bool.
This makes your insistence that you have to subclass bool even more
inexplicable.

> 2 -- he does not allow multiple instances, 3 -- he himself did not
> implement bool using the standard alternative methodology to
> subclassing -- eg: as a composite structure with delegation.

3 is irrelevant. Just because the bool type is "final" (in Java terms)
and un-subclassable doesn't mean that its parent must also be "final".
Subtyping int is allowed.
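(To make that concrete, here is a minimal sketch of a four-value logic
type built by subclassing int. The class and value names are my own
invention, purely illustrative, not anything from this thread.)

```python
class Fuzzy(int):
    """Hypothetical four-value logic: 0=FuzzyFalse, 1=MaybeFalse,
    2=MaybeTrue, 3=FuzzyTrue. Built as an int subclass, which is
    perfectly legal even though bool itself is final."""

    names = ("FuzzyFalse", "MaybeFalse", "MaybeTrue", "FuzzyTrue")

    def __new__(cls, value):
        if not 0 <= value <= 3:
            raise ValueError("four-value logic needs a value in 0..3")
        return super().__new__(cls, value)

    def __repr__(self):
        return self.names[int(self)]

    def __bool__(self):  # __nonzero__ in Python 2
        # Truth-testing collapses four values down to two, so instances
        # still duck-type as bools in any `if` statement.
        return int(self) >= 2

MaybeTrue = Fuzzy(2)
print(repr(MaybeTrue))             # MaybeTrue
print(isinstance(MaybeTrue, int))  # True
print(bool(MaybeTrue))             # True
```

Because truth-testing goes through `__bool__`, such an instance works
anywhere an `if` expects "a bool", without subclassing bool at all.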
There is no restriction on subtyping int to make bool, and indeed you can
subtype int to make your four-value logic class if you wish.

> 4. and he has made bool into the default return type for base type
> comparison operators; which means that general programmers expect a
> bool for base types and may check for it, even if Python's built in
> functions do not.

Believe me, nobody writes code like this:

    flag = x < y
    if not isinstance(flag, bool):
        raise TypeError('expected bool, got something else')
    if flag:
        print "x is less than y"

when they can write:

    if x < y:
        print "x is less than y"

Provided your 4-value logic instances support conversion to bool via the
__bool__ special method (Python 3) or __nonzero__ special method
(Python 2), it will be exceedingly rare that anyone will notice that your
fuzzy-numbers return MaybeFalse instead of False (etc.). The only time
somebody will notice is if they pass the comparison result to a function
which insists on a bool and nothing but a bool:

    def somefunc(flag):
        if not isinstance(flag, bool):
            raise TypeError
        ...

    somefunc(fuzzy_a < fuzzy_b)

But you can't solve every example of bad programming! What if the
function was written like this?

    def somefunc(flag):
        if not (flag is True or flag is False):
            raise TypeError
        ...

That will reject your MaybeFalse and MaybeTrue *no matter what you do*.
(You cannot override identity checks in Python.) The only solution is to
educate your users that, *if and only if* they need to pass a fuzzy-bool
to something which requires an actual bool, they can explicitly normalise
it using:

    somefunc(bool(fuzzy_a < fuzzy_b))

[...]

> As far a subtyping goes; The very fact that Guido used subtyping to
> create bool in the first place (subtype of int), precludes any real
> claim that bool should not itself be subclassable

That's simply wrong. Have you heard of the saying "Don't run with
scissors", usually told to small children?
(I know a nominally mature adult who stabbed himself after tripping over
while running with scissors. Oh how we laughed. Fortunately his injury
was mild and more embarrassing than life-threatening.)

Scissors are usually made from pieces of steel. Would you argue this?

"There is no prohibition on running with arbitrary pieces of steel.
Scissors are made from steel. Therefore, that precludes any claim that we
should not run with scissors."

That int is subclassable has no bearing on whether or not bool (made from
int via subclassing) is subclassable.

> just because bools only have two values; I mean, seriously -- Guido's
> 'strict' Bool is already an impure form of OOP that is borderline
> hypocrisy, as it can '+' to 2 or 3...

Not in the least. Python's bools are "impure" only in the sense that they
violate some people's expectations that bools should be an abstraction.
In the case of Python, they are not abstractions, they are ints.
(Specifically, a subclass of int.) But by the rules of OOP, including the
Liskov Substitution Principle, bools are perfectly good ints. Anywhere
you use an int, you could use a bool instead, and it will work.

Personally, as an ex-Pascal programmer, it took me a long time to get
used to the fact that True and False weren't purely abstract values with
no methods and no operations other than `or`, `and` and `if...else`. But
I not only got used to it, I came to appreciate that it is actually
useful.

> and many other things; and worse I've just come across a couple of
> papers which suggest that Guido doesn't like subclassing when Composite
> object structures could be used instead to replace the subclass 'is'
> relationship with a 'has a' relationship.

I would like to see these papers.

-- 
Steven

-- 
https://mail.python.org/mailman/listinfo/python-list