At 11:09 AM 8/16/00 -0400, John Porter wrote:
>Dan Sugalski wrote:
> >
> > Numbers and strings really aren't different things, at least not as far as
> > people are concerned. They are for machines, but computer languages
> > ultimately aren't for machines, they're for people.
>
>I guess I can't fault you for toeing the party line on this...

Would you stop that? I'm not toeing anything. If I disagreed, believe me, 
you'd hear about it.

>Strings and numbers are not *exactly* the same, even to humans, are they?

At many levels, yes, they are. Past a point they differ, but what you see 
written gets treated pretty much the same. Most numbers have no solid 
meaning to people without actual thought. IIRC, innately you get zero, one, 
two, and lots.
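To put that in Perl terms (a quick sketch of my own, not something from the 
thread, with made-up variable names): the same scalar is a number or a 
string depending on what operator you hand it, and the person reading the 
code never has to stop and care which.

    use strict;
    use warnings;

    my $answer = "42";              # written down as a string
    print $answer + 1,   "\n";      # numeric context: prints 43
    print $answer . "!", "\n";      # string context:  prints 42!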

>The difference between numbers and strings is analogous to --
>or, on further reflection, IDENTICAL to -- the difference between
>arrays and associative arrays.  (The former are numerically indexed,
>the latter indexed by strings.)

The analogy doesn't hold. And people treat arrays and hashes *very* 
differently, far more so than the trivial differences in the notation might 
lead you to believe.
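For instance (again, a sketch of mine, assuming garden-variety Perl 5, 
nothing John wrote): the notation differs by a sigil and a bracket, but the 
idioms around each are nothing alike.

    use strict;
    use warnings;

    # Arrays: order is the point; you push, pop, shift, slice.
    my @queue = ('a', 'b', 'c');
    push @queue, 'd';
    my $next = shift @queue;        # 'a'; @queue is now ('b', 'c', 'd')

    # Hashes: lookup is the point; order is an afterthought.
    my %count = (apples => 3, pears => 1);
    $count{apples}++;
    print "$_: $count{$_}\n" for sort keys %count;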

> > I'm not presuming that, though there are plenty of languages already that
> > have no symbols. Perl's not one of them, though.
>
>Now you're appealing to the argument that "if we changed the language
>to be like that, it simply Wouldn't Be Perl."  Not buying it.

That's fine, I'm not selling it. It is, nonetheless, rather true.

> > It's going to always be more difficult. You need to *think* to turn a word
> > into a symbol. = is already a symbol. Less effort's needed.
>
>I guess I'm not sure what you're getting at here.
>
>In the expression C<foo( bar )>, bar is a symbol, regardless of its type.
>There's no "turning a word into a symbol" going on that I can see.

Not a symbol to the computer--a symbol to the *person*.

Humans' higher-reasoning capabilities are all symbol based, and a lot of 
the brain is set up to turn external stimuli into symbols to be processed. 
The visual cortex is good at recognizing things and tagging them with a symbol. 
When you see a dog, for example, it gets tagged with the symbol for dog. If 
you're familiar with it, it might get a more specific symbol--a breed, or 
even an individual.

That doesn't happen with words--they're already abstract symbols, though in 
a different way. Because of that they get recognized at a lower level and 
passed to the language centers for translation to symbols. That extra 
handoff and translation takes time and mental effort.

> > > > ...exploiting instinct and
> > > > inherent capabilities give you faster response times, and quicker
> > > > comprehension.
> > >
> > >Sure.  But "instinct and inherent capabilities" do not apply here.
> >
> > Yes, they do. People write source. People read source....
>
>Sure.  No argument there.  Nonetheless, humans certainly have no instincts,
>and very likely no inherent capabilities, relevant to computer programming,
>except for abstract reasoning, which IMHO does not favor one side of this
>argument over the other.

That's an incorrect presumption. Humans have instinct and inherent 
capabilities for symbol manipulation, language, and pattern matching. 
That's what's used to write programs and, while it's not ideally suited to 
the task, it's what we've got. Taking advantage of those human capabilities 
is one of the things that Perl does better than most other languages.

                                        Dan

--------------------------------------"it's like this"-------------------
Dan Sugalski                          even samurai
[EMAIL PROTECTED]                         have teddy bears and even
                                      teddy bears get drunk
