Bengt Richter wrote:
On Sat, 23 Apr 2005 22:45:14 -0400, Richard Blackwood <[EMAIL PROTECTED]> wrote:
Robert Kern wrote:
Richard Blackwood wrote:
To All:
Folks, I need your help. I have a friend who claims that if I write:
foo = 5
then foo is NOT a variable, necessarily. If you guys can define for
me what a variable is and what qualifications you have to back you, I
can pass this along to, hopefully, convince him that foo is indeed a
variable.
None of us can do that unless you tell us what he thinks the word
"variable" means. The terminology is a bit fluid. I suspect that your
friend is applying a somewhat restricted notion of "variable" that
coincides with the behavior of variables in some other language.
Indeed, this language is math. My friend says that foo is a constant and
necessarily not a variable. If I had written foo = raw_input(), he would
say that foo is a variable. Which is perfectly fine, except that he
insists that since programming came from math, the concept of a variable
is necessarily identical. This cannot be true. For example, I may
define foo as a dictionary, but I cannot do this within math
because there is no concept of dictionaries within mathematics; yet foo
is a variable, a name bound to a value which can change.
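To make that point concrete (a minimal sketch, not from the original post), here is the same name foo rebound first to an integer and then to a dictionary, something with no counterpart in ordinary mathematical notation:

```python
# foo starts out bound to an integer object
foo = 5

# The same name is then rebound to a dictionary -- a kind of
# value mathematics has no notation for at all.
foo = {"spam": 1, "eggs": 2}

# foo is the same *name* throughout; only the object it is
# bound to has changed.
print(type(foo).__name__)
```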
Maybe he doesn't know that foo = 5 in Python is not an equation as in math,
but a Python source language statement to be translated to a step in some
processing sequence.
Tell him in Python foo is a member of one set and 5 is a member of another,
and foo = 5 expresses the step of putting them into correspondence
to define a mapping, not declaring them equal.
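One small illustration of that difference (a sketch added here, not part of the original message): a line like foo = foo + 1 is contradictory read as an equation, but perfectly ordinary read as a rebinding step in a processing sequence:

```python
# Read as math, "foo = 5" and "foo = foo + 1" cannot both hold.
# Read as Python, each statement simply rebinds the name foo
# to a (possibly new) object, one step after the other.
foo = 5
foo = foo + 1
print(foo)  # 6
```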
Could I honestly argue this to him? On what basis do I argue that it
is not an equation? In any event, he would likely (passionately)
disagree, given his notion that programming is an offshoot of math
and thus has identical concepts and rules at the fundamental level.
Believe it or not, he used to be a programmer. Back in the day (while I
was getting my PhD in philosophy), he was employed as a programmer using
Cobol, Fortran, and other languages of that era. Did his seemingly
peculiar definition of variable exist at that time?
Even in math notation, ISTM important to distinguish between
a finger and what it may for the moment be pointing at.
Regards,
Bengt Richter
--
http://mail.python.org/mailman/listinfo/python-list