On 11 Aug 2000 09:30:03 +0300, Ariel Scolnicov wrote (and quoted):

>>      reduce &avg $identity, @list

>This was my first point regarding C<reduce> -- not all functions have
>an identity element.  One should note that in general
>
>    (reduce &avg $x,@list) != (reduce &sum 0,@list)/@list
>
>for I<any> value of $x; your reduction is I<not> the right way to
>compute an average (I don't know if it was meant to be or not, but it
>got me).
>
>If you can tell me why you wish to perform this reduction, you should
>also be able to figure out an identity element.  You're computing
>
>    (($list[0]+$list[1])/2 + $list[2])/2 + ...
>
>for some reason. 

I have some reservations about this reduce() thing. Plain and simple:
incorporating it into the core language would open the door to lots of
buggy programs, like the example Ariel gave. In fact, I think that
*most* programs that use reduce() would be buggy. But there is more
still.
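To make Ariel's point concrete, here is a minimal sketch (my_reduce(),
$avg2 and $sum are made-up helpers for illustration only, not any
proposed built-in):

        #!/usr/bin/perl -w
        use strict;

        # A hand-rolled left fold, just to show what a reduce() would do.
        sub my_reduce {
            my ($f, $acc, @rest) = @_;
            $acc = $f->($acc, $_) for @rest;
            return $acc;
        }

        my @list = (1, 2, 3, 4);
        my $avg2 = sub { ($_[0] + $_[1]) / 2 };   # pairwise "average"
        my $sum  = sub { $_[0] + $_[1] };

        my $x      = 0;                            # pick any seed you like
        my $folded = my_reduce($avg2, $x, @list);  # ((((0+1)/2+2)/2+3)/2+4)/2 = 3.0625
        my $mean   = my_reduce($sum, 0, @list) / @list;   # 10/4 = 2.5
        printf "folded: %g, true mean: %g\n", $folded, $mean;

No fixed seed $x makes the fold agree with the true mean for arbitrary
lists, which is exactly Ariel's point.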

On a more theoretical plane, what does

        reduce { $_[0] OP $_[1] } $x, $y, $z;

represent? Presumably, the programmer's idea of it is this:

        $x OP $y OP $z

But, is that:

        ($x OP $y) OP $z

or

        $x OP ($y OP $z)

We are used to thinking of associative operators, like "+" and "*",
where it makes no difference. A simple replacement by "-" is already a
counterexample. The "average" example above is, too. (An "operator" and
a "function" are basically the same thing with a different notation.) I
think that *virtually all* programs that people would write using
reduce() would fall into this category.
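To spell out the two readings (fold_left() and fold_right() below are
made-up helpers, not any proposed built-in):

        #!/usr/bin/perl -w
        use strict;

        # Left-associated fold: (($x OP $y) OP $z)
        sub fold_left {
            my ($f, $a, @rest) = @_;
            $a = $f->($a, $_) for @rest;
            return $a;
        }

        # Right-associated fold: $x OP ($y OP $z)
        sub fold_right {
            my $f = shift;
            my $a = pop;
            $a = $f->($_, $a) for reverse @_;
            return $a;
        }

        my $minus = sub { $_[0] - $_[1] };
        print fold_left($minus, 1, 2, 3), "\n";   # (1-2)-3 = -4
        print fold_right($minus, 1, 2, 3), "\n";  # 1-(2-3) =  2

With "+" the two agree; with "-" (or the "average" above) they don't,
and nothing in the call tells you which one the programmer had in mind.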

Besides, in functions like map() and grep(), execution order shouldn't
matter. I think the processing order of the arguments *still* isn't
officially stated to be "always from left to right" (as it is for the
comma operator), even though highly respected Perl hackers like Abigail
(see comp.lang.perl.misc) strongly depend on it. So, what's it going to
be here? I think that if your code depends on execution order, you
shouldn't be using this functional programming paradigm.
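For what it's worth, this is the sort of idiom that quietly depends on
left-to-right processing (a common construction, shown only to
illustrate the dependence):

        my $i = 0;
        my %rank = map { $_ => $i++ } qw(alpha beta gamma);
        # alpha => 0, beta => 1, gamma => 2 -- but only if map really
        # walks its list from left to right.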

I think we don't really need reduce(). There are perfectly workable,
relatively simple, and *far more transparent* alternatives, at least
for Perl. We do have OP= operators, after all. For example:

        my $total = 0;
        map { $total += $_ } @list;  # if you insist...
        return $total;

How is this so bad, compared to:

        reduce { $_[0] + $_[1] } @list

Is the latter shorter? Hardly. Would it be faster? I doubt it. Is it
more transparent to see what it does, or what it is supposed to do? NOT
AT ALL.
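
For completeness: if the map-in-void-context above bothers you, the
same sum reads fine as a plain sub (just a sketch, nothing proposed):

        sub total {
            my $total = 0;
            $total += $_ for @_;
            return $total;
        }

        print total(1 .. 10), "\n";   # sums to 55, no reduce() in sight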

-- 
        Bart.
