On 11/06/2012 10:56 PM, Demian Brecht wrote:
> My question was *not* based on what I perceive to be intuitive
> (although most of this thread has now seemed to devolve into that and
> become more of a philosophical debate), but was based on what I
> thought may have been inconsistent behaviour (which was quickly
> cleared up with None being immutable and causing it to *seem* that the
> behaviour was inconsistent to the forgetful mind).
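(For anyone just joining, here's a minimal sketch of the behaviour in
question -- plain CPython, nothing assumed beyond the stdlib:)

# A flat list of an immutable object: assigning to a slot rebinds
# that slot to a new object -- None itself is never mutated, so the
# four slots stay independent.
row = [None] * 4
row[0] = 5
print(row)                 # [5, None, None, None]

# Nesting the multiplication copies *references*: the outer list
# holds the same inner list four times, so a change made through
# one "row" is visible through all of them.
grid = [[None] * 4] * 4
grid[0][0] = 5
print(grid[2][0])          # 5
print(grid[0] is grid[1])  # True -- literally the same object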
I originally brought up "intuitive", and I don't consider the word to
mean an "exclusive" BEST way -- I meant it to mean easily guessed or
understood. An intelligent person can see when there may be more than
one reasonable explanation -- ergo: I just called your OP intelligent,
even if you were wrong; and D'Aprano ripped you for being wrong.
The debate is degenerating because people are _subjectively_ judging
other people's intelligence.
The less intelligent a person is, the more black and white their
judgements _tend_ to be.
> As you touch on here, "intuition" is entirely subjective. If you're
> coming from a C/C++ background, I'd think that your intuition would be
> that everything's passed by value unless explicitly stated.
Yup -- that's my Achilles heel and bias, I'm afraid.
I learned BASIC, then assembly, then Pascal, and then FORTRAN 77 with
C (historically in that order).
In my view, pass by value vs. pass by reference always exists at the
hardware/CPU level regardless of the language, and regardless of
whether the language hides the implementation details or not.
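In Python terms, the thing the machine copies by value is the
reference itself; a short sketch of the observable consequence
(standard CPython semantics, nothing exotic assumed):

def rebind(x):
    # Rebinding the local name only changes which object 'x' points
    # at; the caller's reference is untouched.
    x = [99]

def mutate(x):
    # Mutating *through* the shared reference is visible to the
    # caller, since both names refer to the same object.
    x.append(99)

a = [1]
rebind(a)
print(a)   # [1]
mutate(a)
print(a)   # [1, 99]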
I'm an EE; I took software engineering to understand the clients who use
my hardware, and to make my hardware drivers understandable to them
through good programming practices. An EE's perspective often leads to
doing efficient things which are hard to understand; that's why I look
for a consensus (not a compromise) before implementing speed/memory
improvements and ways to clarify what is being done.
> Someone coming from another background (Lua perhaps?) would likely
> have entirely different intuition.
Yes, they might be ignorant of what Lua is doing at the hardware level,
even though it *is* doing it.
>> So while I prefer intuitively obvious behaviour where possible, it is not
>> the holy grail, and I am quite happy to give it up.
> I fail to see where there has been any giving up on intuitiveness in the
> context of this particular topic. In my mind, intuitiveness is generally born
> of repetitiveness and consistency.
YES!!!! I think a good synonym would be habit; and when a habit is
good -- it's called a strength, or "virtue"; when it's bad it's called
"vice" or "sin" or "bad programming habit." :)
Virtues don't waste people's time in debugging.
> As everything in Python is a reference, it would seem to me to be
> inconsistent to treat expressions such as [[obj]*4]*4 un-semantically
> (Pythonically speaking) and making it *less* intuitive. I agree that Python
> would definitely be worse off.
That's a fair opinion. I was pleasantly surprised when the third poster
actually answered the "WHY" question with the idea that Python always
copies by reference unless forced to do a deep copy. That's intuitive,
and as a habit (not a requirement) Python implements things that way.
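To make that "forced to do a deep copy" escape hatch concrete, a small
sketch using the stdlib copy module:

import copy

obj = [0]
shared = [[obj] * 4] * 4        # sixteen references, all to the one obj
forced = copy.deepcopy(shared)  # recursively re-instantiates it all

obj.append(1)
print(shared[3][3])  # [0, 1] -- every cell sees the mutation
print(forced[3][3])  # [0]    -- the deep copy is independent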
I've already raised the question about why one would want a multiplier
at all, if it were found that the main desired use case never *wants*
all objects to change together.
I laid out a potential modification of list comprehensions -- which,
BTW, copy by re-instantiating rather than by reference, so the paradigm
of Python is wrong in that case.... But I think the modifications in
that context can't be argued against as easily as list multiplication
(for the same reason that comprehensions already break the
copy-by-reference mold!!!!)
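A minimal side-by-side of the two constructs, showing the comprehension
re-evaluating its expression on every pass:

mult = [[None] * 4] * 4                # one inner list, four references
comp = [[None] * 4 for _ in range(4)]  # four freshly built inner lists

mult[0][0] = 5
comp[0][0] = 5

print(mult[1][0])  # 5    -- the rows are aliases
print(comp[1][0])  # None -- the rows are independent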
--
http://mail.python.org/mailman/listinfo/python-list