Austin Hastings writes:
> How do you handle operator precedence/associativity?
> 
> That is,
> 
>    $a + $b + $c
> 
> If you're going to vectorize, and combine, then you'll want to group. I
> think making the vectorizer a grouper as well kills two birds with one
> stone.
> 
>   $a + >>$b + $c<<
> 
> vs.
> 
>   $a +<< ($b + $c)

I have to agree with Larry here, the latter is much cleaner.

I'm actually starting to like this proposal.  I used to shiver at the
implementation of the old way, where people used the operator to group
arbitrary parts of the expression.  I wouldn't even know how to parse
that, much less interpret it when it's parsed.

Now, we have a clear way to call a method on a list of values:

    @list ».method

And a clear way to call a list of methods on a value:

    $value .« @methods

It's turning out pretty nice.
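To pin down what the two forms do (a Python sketch of my reading; the Perl 6 syntax above is still speculative, and the names here are illustrative only):

```python
# @list ».method  -- call one method on each value in a list
words = ["foo", "bar"]
upper = [w.upper() for w in words]
assert upper == ["FOO", "BAR"]

# $value .« @methods  -- call each of a list of methods on one value
value = " hi "
methods = ["strip", "upper"]
results = [getattr(value, m)() for m in methods]
assert results == ["hi", " HI "]
```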

> > You might argue that we should force people to think of it one way or
> > the other.  But there's a reason that some people will think of it
> > one way while others will think of it the other way--I'd argue that
> > vectorization is not something that happens to *either* the operand
> > or the operator.  Vectorization is a different *relationship* between
> > the operator and the operand.  As such, I still think it belongs
> > between.
> >
> > Plus, in the symmetrical case, it *looks* symmetrical.  Marking the
> > args in front makes everything look asymmetrical whether it is or not.
> 
> Just a refresher, what *exactly* does vectorization do, again?  I think of
> it as smart list-plus-times behavior, but when we go into matrix arithmetic,
> that doesn't hold up. Luke?

Well, for being called "vector operators", they're ending up pretty
useless as far as working with mathematical vectors.  As a
mathematician, I'd want:

    @vec1 »*« @vec2

To do an inner or an outer product (probably outer, as it has no
implicit connotations of the + operator).  That is, it would come out
either a matrix or a scalar.

But there are other times when I'd want that to apply the operator
pairwise, element by element.  Then you get this for the inner product:

    sum(@vec1 »*« @vec2)

Which isn't so bad, after all.
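The two readings can be sketched in plain Python (my interpretation of the proposal, not settled semantics):

```python
v1 = [1, 2, 3]
v2 = [4, 5, 6]

# Pairwise reading of »*«: pair up respective elements and multiply.
elementwise = [a * b for a, b in zip(v1, v2)]

# The inner product then falls out as sum(@vec1 »*« @vec2).
inner = sum(elementwise)
assert inner == 32                  # 1*4 + 2*5 + 3*6

# Outer-product reading: every element against every element,
# giving a matrix (one row per element of v1).
outer = [[a * b for b in v2] for a in v1]
assert outer[0] == [4, 5, 6]        # 1*4, 1*5, 1*6
```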

Hmm, but if it does give an outer product, then a generic tensor product
is as easy as:

    reduce { $^a »+« $^b } @A »*« @B

And nobody can fathom how happy that would make me.
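Mechanically, under the outer-product reading of »*«, that reduce folds the rows of the outer product together elementwise (a Python sketch of the mechanics only, not a claim about the mathematics):

```python
from functools import reduce

A = [1, 2]
B = [10, 20]

# @A »*« @B as an outer product: one row per element of @A.
outer = [[a * b for b in B] for a in A]     # [[10, 20], [20, 40]]

# reduce { $^a »+« $^b }: fold the rows together elementwise.
folded = reduce(lambda r1, r2: [x + y for x, y in zip(r1, r2)], outer)
assert folded == [30, 60]
```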

Also, you'd get the nice consistency that:

    @A »+« @B

Is the same as both:

    map { @A »+ $^b } @B
    map { $^a +« @B } @A

Which is undoubtedly what a mathematician would expect.  (It's very
reminiscent of how junctions currently work.)
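As a sketch of that consistency (Python standing in for the speculative syntax, so the exact semantics are my assumption): the outer sum contains one row per element of one operand, and its transpose one row per element of the other, so the two map forms describe the same table read along different axes:

```python
A = [1, 2, 3]
B = [10, 20]

# Outer sum: table[i][j] == A[i] + B[j]
outer = [[a + b for b in B] for a in A]

# map { $^a +« @B } @A -- each element of @A against all of @B
rows_by_A = [[a + b for b in B] for a in A]

# map { @A »+ $^b } @B -- all of @A against each element of @B
rows_by_B = [[a + b for a in A] for b in B]

assert outer == rows_by_A
# rows_by_B is the same table, transposed.
assert rows_by_B == [list(col) for col in zip(*outer)]
```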

But then there's the problem of how you express the oft-wanted:

    map -> $i { @A[$i] + @B[$i] } 0..^min(+@A, +@B)

(Hopefully when @A and @B are of equal length).
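That pairwise form could be sketched in Python (again an assumption about the intended semantics; `zip` truncates to the shorter list, matching the `min()` bound above):

```python
A = [1, 2, 3]
B = [10, 20, 30, 40]

# Pairwise: index i of the result pairs A[i] with B[i],
# stopping at the shorter operand.
pairwise = [a + b for a, b in zip(A, B)]
assert pairwise == [11, 22, 33]
```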

Maybe there's a way to do it so that we can both be happy: one syntax
that does one thing, another that does the other.  Like:

    @A »+« @B           # One-at-a-time
    @A «+» @B           # Outer product

Or something.  Hmm, then both:

    @A »+ $b
    @A «+ $b

Would mean the same thing. 

Luke