On Fri, Aug 11, 2000 at 01:20:36PM +0200, Bart Lateur wrote:
> I think we don't really need reduce(). There are perfectly workable,
> relatively simple, and *far more transparent* alternatives, at least,
> for Perl. We do have OP= operators, after all. For example:
>
> $total = 0;
> map { $total += $_ } @list; # if you insist...
> return $total;
Randal ???
> How is this so bad, compared to:
>
> reduce { $_[0] + $_[1] } @list
>
> Is the latter shorter? Hardly.
Yes.
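With the reduce in List::Util the running value and the next element arrive in
the package variables $a and $b (not @_), so the whole thing collapses to one
expression. A minimal sketch (the @list here is just illustrative):

    use List::Util qw(reduce);

    my @list  = (1 .. 10);
    # one expression, no temporary accumulator and no explicit loop
    my $total = reduce { $a + $b } @list;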
> Would it be faster? I doubt it.
Yes. Just look at the benchmark stats for the reduce in List::Util:
use Benchmark;
use List::Util qw(reduce sum);

timethese(1000, {
    'for'    => sub { my $t = 0; $t += $_ for 0..5000; $t },
    'reduce' => sub { reduce { $a + $b } 0..5000 },
    'sum'    => sub { sum 0..5000 },
});
Benchmark: timing 1000 iterations of for, reduce, sum...
       for:  4 wallclock secs ( 4.03 usr +  0.00 sys =  4.03 CPU)
    reduce:  3 wallclock secs ( 3.28 usr +  0.00 sys =  3.28 CPU)
       sum:  0 wallclock secs ( 0.41 usr +  0.00 sys =  0.41 CPU)
That's close to a 20% speedup, and the sum figures show that if the compiler
could recognize __OP__ and optimize it, nearly a 10x speedup is available.
I find that significant.
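To make concrete what "recognize __OP__" means: an optimizer that spotted
reduce { $a + $b } could replace the per-element sub calls with a single
specialized op, which is essentially what the XS sum already does by hand.
A rough sketch of the before/after, assuming addition as the recognized
operator:

    use List::Util qw(reduce sum);

    my @list = (0 .. 5000);

    # generic form: one Perl sub call per element
    my $slow = reduce { $a + $b } @list;

    # what the optimization would effectively emit: a single C-level loop
    my $fast = sum @list;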
> Is it
> more transparent to see what it does, or what it is supposed to do? NOT AT ALL.
That is in the eye of the beholder.
Graham.