I've tested with a file containing 1M numbers between 1 and 100.
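For anyone wanting to reproduce the timings, a test file like the one described above can be generated with standard tools. A minimal sketch (the filename `numbers.txt` and the exact distribution are my assumptions; the original file's contents aren't specified):

```shell
# Generate 1,000,000 random integers in 1..100, one per line,
# approximating the test file described above (exact contents unknown).
awk 'BEGIN { srand(42); for (i = 0; i < 1000000; i++) print int(rand() * 100) + 1 }' > numbers.txt
```

Feeding that file to each of the one-liners below keeps the timings comparable.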

The quick answer would be: use `say lines.sum`.  But since `[+]` is already 
internally optimized to `.sum`, that doesn't gain you anything.

If you know that you will only see integer numbers, you can make it a bit 
faster by explicitly coercing to `Int` before summing:

    say lines.map(*.Int).sum

This takes it down from 2.9 seconds to 2.5 seconds for me.

A quicker way I found is to bypass the overhead of finding lines by slurping 
the whole file in one go, then splitting on newlines and summing:

    say slurp.split("\n").sum

This brings it down from 2.9 to about 2 seconds.

Without resorting to NQP, I think that is the current state of affairs.

> On 23 Sep 2019, at 19:57, Marc Chantreux <e...@phear.org> wrote:
> 
> hello,
> 
> question: in raku, is there a faster solution than
> 
>  say [+] lines
> 
> long story:
> 
> here is a thread i would like to reply to
> 
> https://stackoverflow.com/questions/450799/shell-command-to-sum-integers-one-per-line
> 
> because:
> 
> * the perl5 answer is fast compared to the other dynamic languages
> * the shortest solution is very interesting: even if you have an average
>  level in dc (i think i have one), it is not that easy to read the
>  answer (commented in the post)
> 
>    dc -f infile -e '[+z1<r]srz1<rp'
> 
> obviously: raku would be my preferred answer with:
> 
>   seq 100000000 | perl6 -e 'say [+] lines'
> 
> but ...
> 
>   seq 100000000 | time perl6 -e 'say [+] lines'
>   5000000050000000
>   perl6 -e 'say [+] lines'  1591,80s user 3,04s system 122% cpu 21:36,63 total
> 
>   seq 100000000 | time /usr/bin/perl6 -e 'say [+] lines'
>   5000000050000000
>   /usr/bin/perl6 -e 'say [+] lines'  2171,01s user 4,64s system 129% cpu 28:04,73 total
> 
> when
> 
>  perl6 -v
>    This is Rakudo version 2019.03.1 built on MoarVM version 2019.03
>    implementing Perl 6.d.
>  /usr/bin/perl6 -v
>    This is Rakudo version 2018.12 built on MoarVM version 2018.12
>    implementing Perl 6.d.
> 
> regards
> marc
