-------- Original Message --------
Subject: Re: Comparisons and sorting of a numeric class....
Date: Mon, 26 Jan 2015 05:38:22 -0800
From: Andrew Robinson <andr...@r3dsolutions.com>
To: Steven D'Aprano <steve+comp.lang.pyt...@pearwood.info>
On 01/24/2015 12:27 AM, Steven D'Aprano wrote:
Andrew Robinson wrote:
But let me explain a bit more why I'm picking on Python: even if we
set aside the electronic engineering concerns I've raised (and they
are valid, since OOP is supposed to model reality, not have reality bent to
match OOP) -- people's facile explanations of why Python's version of
bool is the way it is still bother me here on the python mailing list
-- because people seem to have a very wrong idea that bool's nature as
a dualton is somehow justified solely by the fact that there are only
two values in Boolean logic;
Nobody has suggested that except you.
Yes, they did 'suggest' it.
Earlier I even stated that I didn't
know GvR's motivation in making True and False singletons, but suggested
that it might have been a matter of efficiency.
True... but you are not the only person on this list.
Although I seriously doubt that either efficiency or memory
conservation plays a part in this in any measurable way. E.g.: you
accused me of 'pre-optimizing' earlier, and if anything looks like a
premature optimization, it's the bool itself, because computers have tons of
memory now, and I seriously doubt modern Python versions can even run on
a small micro-controller without an almost complete rewrite of the
language.... And as far as efficiency goes, I know speed won't change
significantly whether a singleton or multiple instances are
allowed, except under very unusual circumstances. So, even now -- I have
no idea why Guido chose to make bool so restrictive other than because
he thought C++ was absolutely restrictive, when in fact C++'s type
system is more flexible than he seems to have noticed.
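(For the record, a quick interactive check of what the 'dualton' amounts to
in CPython -- every boolean expression simply hands back one of the same two
cached objects; this is just an illustration of the behaviour, not an argument
about why it was chosen:)

    >>> (1 == 1) is True
    True
    >>> bool([]) is False
    True
    >>> import copy
    >>> copy.copy(True) is True    # even copying can't produce a new instance
    True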
And I know Charles bool didn't use singletons in his algebra -- just read his
work and you'll see he never mentions or describes them, but he does
actually use dozens of instances of the True and False objects he was
talking about, for the obvious reason that he would have needed special
mirrors, dichroic or partially silvered, to even attempt to make one
instance of True written on paper show up in multiple places. And that
would be silly to do when there's no compelling reason to do it.
In the words of physicist Wolfgang Pauli, that is not even wrong.
http://en.wikipedia.org/wiki/Not_even_wrong
I'm actually gobsmacked that you could seriously argue that because Boole
wrote down true and false (using whatever notation he chose) more than
once, that proves that they aren't singletons. That's as sensible as
claiming that if I write your name down twice, you must be two people.
Clearly: Charles bool did not use singletons, and you are wasting your
breath splitting hairs over a moot point.
Yet -- people here seem to want to insist that the bool type with only
two instances is some kind of pure re-creation of what Charles Bool
did -- when clearly it isn't.
Nobody has argued that Boole (note the spelling of his name) considered True
and False to be singletons. Being a mathematician, he probably considered
that there is a single unique True value and a single unique False value,
in the same way that there is a single unique value pi (3.1415...) and a
single unique value 0.
The spelling caveat is great -- and in Python the object named in bool's
honor is spelled bool (lowercase, too). ;) Another point about the
inconsistency of the object with the historical author; I just love it,
which is part of why I'm going to keep on spelling it like that!!!! Even
the spelling suggests Python is really acting like a lemming and
just doing bool because Guido thought other languages do bool... so
I'll just continue, because it's fitting that anyone who mocks my use of
the mis-spelling bool also mocks Python's.
But "the number of instances" and "singleton" are concepts from object
oriented programming, which didn't exist when Boole was alive.
Yep -- I made that point myself in an earlier e-mail. Do you feel
brilliant, or something, copying my remarks?
It is not
even wrong to ask the question whether Boole thought of true and false to
be singleton objects. He no more had an opinion on that than Julius Caesar
had an opinion on whether the Falkland Islands belong to the UK or
Argentina.
Oooooh! So you want people to think you commune with the dead, and know
he never thought about it, alive or dead? D'Aprano speaks
posthumously for Dr. bool?
You're being very condescending and arrogant and arguing in pointless
circles!
I said, and I quote "He didn't use singletons in his algebra" -- if you
can show where he did, I'll retract my remark; otherwise -- you're
merely proving my point.
It's an amalgamation of enhancements, such
as binary words instead of just True/False and many other things
(operators that work on words rather than single bits). So -- I don't
see that Python's implementation of bool is justified either by a purist
appeal to Charles bool, or by ignoring the pragmatic concerns that
Electrical Engineers have attached to Bool's work for years in order to
make it more useful for practical computer problems. Yet these two
things are what this python list has sort of harped on.
Python's bools are objects, and they represent Boolean values, not strings,
In my experience it's far more likely that an object in a computer *at
least* represents what can actually be found in the code of that object,
rather than claiming that objects in python have nothing to do with
what is found in the object. I *DO* know that the bool *object* in
python certainly does represent a string in its repr() method; eg: the
strings "True" or "False", to be precise.
So: where exactly is this boolean 'value' in the computer that you are
harping on, and what does it consist of? The moment you point out
where it is, I will point out that it is likely transmitted by wires
designed by engineers using HDLs, or else you're going to
point to something containing a string or an integer; but whatever it
is, I know for sure it's not some mythical 'bool' in reality.
The bool values you seem to want to talk about exist only in your mind,
and you can't actually show them to me; they aren't real objects in
Python or in a computer. Your whole argument is non-falsifiable and rather
pointless as far as I can tell.
not lists, not floats, and not three-value or four-value logic values. They
are designed for Boolean two-valued logic, like the bulk of programming
languages, not for simulating hardware.
That's a non sequitur, and downright lame.
I showed Ian a way to do what I wanted to do in C++, so the bulk of
programming languages is irrelevant, as each has its own quirks and
workarounds; and you obviously don't know that most engineers use
programmable logic in their projects now, so they use HDLs in
*synthesis* mode, not just 'simulation'. The vast majority of all
hardware projects done today are built using this supposed 'niche' you
talk about; I can, for example, build a complete web site using a
Xilinx FPGA and an HDL without ever using C/C++, Python, an
Intel processor, or any other dumbed-down 'bool'-based language you can
think of. The difference is that HDLs are meant to ultimately
instantiate objects into hardware rather than create a software solution.
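If it helps to see what I mean by multi-valued logic, here is a rough sketch
in Python of an IEEE-1164-style four-valued signal; the Signal class and its
behaviour are just my own toy illustration, not anything from a real HDL or
library:

    from enum import Enum

    class Signal(Enum):
        # Four-valued logic in the spirit of IEEE 1164: driven low, driven
        # high, unknown/conflict, and high-impedance (undriven).
        ZERO = "0"
        ONE = "1"
        X = "X"
        Z = "Z"

        def __and__(self, other):
            # A definite 0 dominates; otherwise any X or Z contaminates
            # the result, which is roughly what HDL simulators do.
            if Signal.ZERO in (self, other):
                return Signal.ZERO
            if self is Signal.ONE and other is Signal.ONE:
                return Signal.ONE
            return Signal.X

        def __bool__(self):
            # Only definite values collapse to a Python bool; X and Z
            # deliberately refuse to, because the answer isn't known.
            if self is Signal.ONE:
                return True
            if self is Signal.ZERO:
                return False
            raise ValueError("%r is not a definite 0 or 1" % self)

    print(Signal.ONE & Signal.Z)   # Signal.X -- information a plain bool can't hold
    print(bool(Signal.ZERO))       # False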
...snip...
[...]
Where python is different from other languages regarding bool -- and
deserves a little extra picking on -- is that Guido has imposed four
constraints on bool simultaneously which together cause a conflict
that I don't think (offhand) I've ever encountered in another language;
definitely not in C/C++!
The four things are: 1 -- he cut off subtyping and created no alternate
method of doing type checking of duck-types,
The point of duck-typing is that you *don't* type check,
Gee... you are acting rather inconsistently for someone with so much
programming experience.
You do a duck type check by testing whether it acts like a duck.... obviously!
But I was also referring to how the built-in Python type() function is
strict -- eg: a duck type "FAILS" a standard type check (in general).
There is no alternate way to get duck types to pass *existing* type()
checks -- and yes, Python DOES do type() checks in various places whether or
not you or I want it to; and other programmers can call type() in
their legacy code, too.
Hence, duck types often *don't work* because there is no way to
circumvent type() checks that already exist in the Python language, and
for this very reason a simple Google search will reveal bug reports
where people are asking Guido to remove type checks altogether in
various places in Python 3, precisely because duck types fail to work in
those places. (I wouldn't mind if they were gone.)
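Here's a short, concrete illustration of both points -- the DuckBool class is
just a toy example I made up, but the behaviour is exactly what I'm describing
(CPython):

    class DuckBool:
        # Acts like a bool for truth testing, but is not a bool subclass.
        def __init__(self, value):
            self._value = bool(value)
        def __bool__(self):
            return self._value

    d = DuckBool(1)

    # Duck typing works anywhere Python only asks "is it truthy?":
    print("truthy" if d else "falsey")   # truthy

    # ...but it fails any strict type check that existing code performs:
    print(type(d) is bool)               # False
    print(isinstance(d, bool))           # False

    # And the usual escape hatch, subclassing, is closed off for bool:
    try:
        class MyBool(bool):
            pass
    except TypeError as e:
        print(e)   # e.g. "type 'bool' is not an acceptable base type"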
I really don't have time for the rest of your email; it's pointless...
A lemming is, as a lemming does...
--
https://mail.python.org/mailman/listinfo/python-list