Gary: That's compelling indeed, and I will look into it more! Thanks,
-Lee

PS: Would a call to vec do the same thing as "into []" here?

On Jun 2, 2014, at 7:14 PM, Gary Johnson <gwjoh...@uvm.edu> wrote:

> Hey Lee,
>
> I would second Jozef's suggestion that you look into using the reducers
> library when you need non-lazy sequence operations. Although a major
> motivation of Rich's work was clearly to enable easy parallel folding via
> fork/join, the fold function is only one of many in this library. Think
> instead that the main (philosophical) purpose of reducers is to decomplect
> the reducing operation from the data representation it is acting on. And of
> course, since reduce can be used to implement (virtually) any non-lazy
> sequence operation, it stands to reason that reducers should be fully
> capable of providing new implementations of many of these functions on top
> of reduce (which the library does).
>
> Importantly, whenever you chain sequence operations together, reducers
> should be more efficient than both the lazy sequence functions (e.g., map,
> filter) and the eager vector-returning functions (e.g., mapv, filterv).
> This is because a chain of reducing functions generates no intermediate
> representations.
>
> <obligatory contrived example>
> For example, let's say I wanted to sum the squares of all the even numbers
> in a sequence called samples.
>
> Using lazy functions: (reduce + (map #(* % %) (filter even? samples)))
>
> Using non-lazy functions: (reduce + (mapv #(* % %) (filterv even? samples)))
>
> Using reducers (aliased as r): (reduce + (r/map #(* % %) (r/filter even? samples)))
> </obligatory contrived example>
>
> If you need to collect the results of a sequence operation in a data
> structure rather than reducing them to an atomic value, simply use into
> rather than reduce (since into uses reduce under the hood).
>
> So to collect the squares of all the even numbers in the samples sequence,
> just do this:
>
> (into [] (r/map #(* % %) (r/filter even? samples)))
>
> As just one data point, when I updated a statistical fire analysis
> algorithm that I wrote from the lazy sequence functions to the reducers
> library, I experienced a full order-of-magnitude speedup: my runtime
> dropped from ~6 hours to around 20 minutes. So please do yourself a favor
> and give this library a close look. It has made worlds of difference for
> some of my work.
>
> Good luck,
>   ~Gary
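
For anyone who wants to try Gary's comparison at a REPL, here is a minimal,
self-contained sketch of the three approaches he describes. The samples
vector and the reducers-example namespace are placeholders invented for
illustration; only the require of clojure.core.reducers (aliased as r, as in
his message) is essential.

(ns reducers-example
  (:require [clojure.core.reducers :as r]))

;; Hypothetical input data; any sequential collection of numbers will do.
(def samples (vec (range 1000000)))

;; Lazy sequence version: map and filter each build an intermediate lazy seq.
(defn sum-squares-lazy [xs]
  (reduce + (map #(* % %) (filter even? xs))))

;; Eager vector version: mapv and filterv each build an intermediate vector.
(defn sum-squares-eager [xs]
  (reduce + (mapv #(* % %) (filterv even? xs))))

;; Reducers version: r/filter and r/map compose into a single reduction,
;; so no intermediate collection is realized.
(defn sum-squares-reducers [xs]
  (reduce + (r/map #(* % %) (r/filter even? xs))))

;; Collecting results into a data structure instead of an atomic value:
;; into uses reduce under the hood, so it works directly with reducers.
(defn even-squares [xs]
  (into [] (r/map #(* % %) (r/filter even? xs))))

(comment
  ;; All three sums return the same value; only the timings differ.
  (time (sum-squares-lazy samples))
  (time (sum-squares-eager samples))
  (time (sum-squares-reducers samples))
  (take 5 (even-squares samples)))  ;=> (0 4 16 36 64)

Note that r/map and r/filter return reducible (and foldable) objects rather
than collections, which is why they are consumed with reduce or into rather
than treated as seqs.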