Re: The Concepts and Confusions of Prefix, Infix, Postfix and Fully Functional Notations

2007-06-10 Thread Reilly
On Jun 10, 3:11 pm, Larry Elmore <[EMAIL PROTECTED]>
wrote:
> Twisted wrote:
> > On Jun 9, 8:21 pm, "BCB" <[EMAIL PROTECTED]> wrote:
> >> "Paul McGuire" <[EMAIL PROTECTED]> wrote in message
>
> >>news:[EMAIL PROTECTED]
>
> >>> On Jun 9, 6:49 am, Lew <[EMAIL PROTECTED]> wrote:
> > In particular, Perl code looks more like line
> > noise than like code from any known programming language. ;))
>  Hmm - I know of APL and SNOBOL.
>  --
>  Lew
> >>> TECO editor commands.  I don't have direct experience with TECO, but
> >>> I've heard that a common diversion was to type random characters on
> >>> the command line, and see what the editor would do.
> >>> -- Paul
> >> J
>
> >>http://www.jsoftware.com/
>
> > Oh come on! Toy languages (such as any set of editor commands) and
> > joke languages (ala Intercal) don't count, even if they are
> > technically Turing-complete. ;)
>
> > Nor does anything that was designed for the every-character-at-a-
> > premium punch-card era, particularly if it is, or rhymes with,
> > "COBOL".
>
> > Those have excuses, like it's a joke or it's a constrained
> > environment. Perl, unfortunately, has no such excuses. If there were
> > such a thing as "embedded Perl", I'd have to hesitate here, but since
> > there isn't...
>
> Neither APL nor Snobol nor J are toy or joke languages.

I'd like to register my agreement.  SNOBOL was a very sophisticated
language, way ahead of its time in many ways.  While it's not really
used any more, SNOBOL's legacy does live on in languages that are in
wide use.

APL and its successors (including J & K) are neither toys nor extinct
relics.  APL is still used in a variety of applications.  The price of
the last airline ticket you bought was probably determined by a yield
management application written in APL.  K was created in 1993, and Kx
Systems has built an incredibly valuable company on top of it.

APL's terseness has more to do with Iverson's notational goals than
with saving characters on punched cards.  In fact, the dominant
languages of the punchcard era (COBOL & FORTRAN) are both pretty
verbose.

Lastly, ITS TECO wasn't a joke or a toy language either.  It was
psychotically terse and virtually impenetrable on later review, but it
wasn't a toy.  When I learned to use EMACS, it was still implemented in
ITS TECO.




Re: merits of Lisp vs Python

2006-12-11 Thread Andrew Reilly
On Tue, 12 Dec 2006 16:35:48 +1300, greg wrote:

> When a Lisp compiler sees
> 
>(setq c (+ a b))
> 
> it can reasonably infer that the + is the built-in numeric
> addition operator. But a Python compiler seeing
> 
>c = a + b
> 
> can't tell *anything* about what the + means without
> knowing the types of a and b. They might be numbers, or
> strings, or lists, or some user-defined class with its
> own definition of addition.

That may be true, but lisp's numeric addition operator knows how to add
fixnums, bignums, rationals and floats (the "inexact" numbers, in Scheme
terms) -- something that few (if any) processor instruction sets can
manage.  So that's still type-dependent dispatch, which isn't going to
get us to the speeds that we actually see reported unless there's extra
stuff going on.  Type inference?  Declarations?
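
To make the dispatch concrete, here's a rough Python sketch of the sort
of thing that has to happen behind "a + b" at run time.  It's
illustrative only -- CPython does this in C, and its actual rules for
__radd__ and subclasses are a bit more involved -- but it shows why an
untyped "+" can't just compile down to a single add instruction:

    def generic_add(a, b):
        # Ask the left operand's type how to add first...
        add = getattr(type(a), '__add__', None)
        if add is not None:
            value = add(a, b)
            if value is not NotImplemented:
                return value
        # ...then fall back to the right operand's reflected method.
        radd = getattr(type(b), '__radd__', None)
        if radd is not None:
            value = radd(b, a)
            if value is not NotImplemented:
                return value
        raise TypeError("unsupported operand types for +")

    print(generic_add(1, 2))        # 3       (integer addition)
    print(generic_add("a", "b"))    # 'ab'    (string concatenation)
    print(generic_add([1], [2]))    # [1, 2]  (list concatenation)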

Cheers,

-- 
Andrew


Re: merits of Lisp vs Python

2006-12-14 Thread Andrew Reilly
On Thu, 14 Dec 2006 03:01:46 -0500, Ken Tilton wrote:

> You just 
> aren't used to thinking at a level where one is writing code to write code.

Firstly, I'm looking into lisp because my current python project is too
full of boilerplate :-) and too slow.  Coming from a C and assembler
background, I'm *used* to meta-programming, and do it all the time.  I
even use python, Matlab and bash to write C, sometimes :-)

However, in this particular instance, I'm inclined to wonder why
meta-programming is the right answer, rather than just doing all of the
interpolation and what-not at run time, driven by a big table of your
algebra rules.  It's for output to a human, isn't it?  It's not as
though it needs to be particularly fast.
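
Something like this (in Python, since that's what I know best) is the
kind of table-driven thing I mean.  The rule names and fields here are
made up purely to show the shape of it, not taken from the actual
example:

    # A made-up table of algebra rules; names and text are placeholders.
    RULES = {
        'collect-terms': {
            'hint': "Combine the like terms on each side: %(lhs)s = %(rhs)s",
            'annotation': "Terms with the same variable and power can be added.",
        },
        'isolate-x': {
            'hint': "Move everything except %(var)s to the other side.",
            'annotation': "Apply the inverse operation to both sides.",
        },
    }

    def hint_for(rule, **context):
        # All the interpolation happens here, at run time, when we
        # actually talk to the student.
        return RULES[rule]['hint'] % context

    print(hint_for('collect-terms', lhs='3x + 2x', rhs='10'))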

Maybe I'm just not digging the example sufficiently.  That's likely: I've
yet to write my first lisp program...

Cheers,

-- 
Andrew


Re: merits of Lisp vs Python

2006-12-14 Thread Andrew Reilly
On Thu, 14 Dec 2006 04:06:26 -0500, Ken Tilton wrote:
> Ken Tilton wrote:
>> Andrew Reilly wrote:
>>> However, in this particular instance, I'm inclined to wonder why
>>> meta-programming is the right answer, rather than just doing all of the
>>> interpolation and what-not at run-time, based on a big table of your
>>> algebra rules? 
>> 
>> I am afraid I do not see what alternative you are suggesting. I 
>> especially do not see how interpolation is in play.
> 
> [Guessing pending your clarification] "Interpolation" does happen at 
> runtime. This not about the actually quite rare use of macrology to move 
> certain calculations to compile time, this is about getting dozens of 
> transformation-specifc rules written to fit into a larger mechanism (by 
> having the right arguments and returning the right kinds of results, 
> with a minimum of boilerplate and a maximum of resiliency in the face of 
> refactoring.
> 
> The reason I post macro expansions along with examples of the macro 
> being applied is so that one can see what code would have to be written 
> if I did not have the defskill macro to "write" them for me. I sugest 
> one start there, by comparing before and after.

Please pardon my woeful grasp of lisp: that's probably why I'm off-beam
here.  It seemed to me that the bulk of your macro-ified/templated
version was taking some text and a "reverse" operation and creating the
methods of an object, or generic functions, or some such.  Each skill
seems to have a title, a list of annotations, and a list of hints (and a
reverse, which I don't understand).  That all looks like data.  Couldn't
you do that with a table containing those fields, keyed off the defskill
argument (or even the title?) at startup?  Then you don't have to worry
about refactoring the code: there's only ever going to be one piece of
code, driven by a table.
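
Roughly, in Python terms (the field names here are guesses from your
description, not the real defskill interface):

    # Guessed fields, based on the description above; the real defskill
    # macro surely does more than this.
    class Skill(object):
        def __init__(self, title, annotations, hints):
            self.title = title
            self.annotations = annotations
            self.hints = hints

    SKILLS = {}  # one table, filled in at startup, instead of generated code

    def defskill(name, title, annotations=(), hints=()):
        SKILLS[name] = Skill(title, list(annotations), list(hints))

    defskill('factor-quadratic',
             title='Factoring a quadratic',
             annotations=['look for two numbers that multiply to give c'],
             hints=['try small integer factor pairs first'])

    # The single piece of driving code just looks entries up:
    print(SKILLS['factor-quadratic'].title)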

I only mentioned interpolation because it seemed likely that you might
want to be mutating these strings to be more specific to what your student
was actually doing.  I didn't expect that "42" was necessarily the right
answer...

To back out a bit here, and get back to the meat of the matter: if one
is using Python, then it's because one doesn't much care about
performance, and it's reasonable to do expansions, pattern matching and
domain-specific language creation and use at run time.  After all,
that's mostly how the language itself works.

When one finds that one *does* care about performance, that doesn't leave
much wriggle room, though...

-- 
Andrew


Re: The Importance of Terminology's Quality

2008-08-20 Thread Andrew Reilly
On Thu, 21 Aug 2008 02:36:39 +, sln wrote:

>>Whats os interresting about all this hullabaloo is that nobody has coded
>>machine code here, and know's squat about it.
>>
>>I'm not talking assembly language. Don't you know that there are
>>routines that program machine code? Yes, burned in, bitwise encodings
>>that enable machine instructions? Nothing below that.
>>
>>There is nobody here, who ever visited/replied with any thought
>>relavence that can be brought foward to any degree, meaning anything,
>>nobody
>>
>>sln
> 
> At most, your trying to validate you understanding. But you don't pose
> questions, you pose terse inflamatory declarations.
> 
> You make me sick!

Could you elaborate a little on what it is that you're upset about?  I 
suspect that quite a few readers of these posts have designed and built 
their own processors, and coded them in their own machine language.  I 
have, and that was before FPGAs started to make the exercise quite 
commonplace.  But I don't see how that's at all relevant to the debate 
about the power or other characteristics of programming languages.  
Certainly anyone who has programmed a machine in assembly language has a 
pretty fair understanding of what the machine and its machine language 
are doing, even if they choose not to bang the bits together manually.

Hope you get better.

-- 
Andrew


Re: If Scheme is so good why MIT drops it?

2009-07-26 Thread Andrew Reilly
On Sun, 26 Jul 2009 09:31:06 -0400, Raffael Cavallaro wrote:

> On 2009-07-26 09:16:39 -0400, a...@pythoncraft.com (Aahz) said:
> 
>> There are plenty of expert C++
>> programmers who switched to Python;
> 
> "plenty" is an absolute term, not a relative term. I sincerely doubt
> that the majority of python users were formerly *expert* C++
> programmers.
> 
>> your thesis only applies to the
>> legions of people who found it difficult to learn C++ in the first
>> place.
> 
> No, my thesis applies to the overwhelming majority of programmers who
> found it more difficult to *master* (i.e., not merely use) C++ as
> opposed to mastering python. BTW, this is a *complement* not a dis;
> python is a better language than C++ precisely because it is more
> sensibly and elegantly designed than C++ and therefore easier to master.

Isn't it widely accepted that the number of people who have mastered C++ 
is about five?  All of the rest of us just struggle...

[I know enough of C++ to avoid it whenever I can, and not to use it for 
my own projects.  I'm happy with a mix of C, python and lisp (or scheme).]

-- 
Andrew