From the article: "The combining fn must supply an identity value when 
called with no arguments."  This means you can't use a combining function 
whose identity value satisfies the proper rules but can't actually be 
computed.  E.g. set intersection:

(intersection) == e == the set of all possible elements of a set

Of course this arity doesn't exist, but the arity-1 could be viewed as 
shorthand for:
(intersection s) == s 
; == (intersection s e) == (intersection e s)  ; if e could actually be computed
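
Concretely (with clojure.set loaded):
(intersection #{1 2})  ;==> #{1 2}  ; the 1-arity exists
(intersection)  ;==> throws ArityException  ; the 0-arity doesn't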

So, the new lib behaves slightly differently than core/reduce here:
(use 'clojure.set)
(require '[clojure.core.reducers :as r])
(reduce intersection [#{1 2} #{2 3}])  ;==> #{2}
(r/reduce intersection [#{1 2} #{2 3}])  ;==> throws ArityException
; for completeness
(reduce intersection []) ;==> throws ArityException
(r/reduce intersection []) ;==> throws ArityException
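
For contrast, a combining fn whose identity is computable, e.g. union (whose 
0-arity returns #{}), behaves the same under both reduces:
(reduce union [#{1 2} #{2 3}])  ;==> #{1 2 3}
(r/reduce union [#{1 2} #{2 3}])  ;==> #{1 2 3}
(r/reduce union [])  ;==> #{}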

It might fix things to special-case empty collections and make the "leaves" 
of the recursion single elements (so the identity is never needed), but 
maybe functions with these weird non-computable identity elements, like set 
intersection, are too rare to bother with.  I can't think of another one 
off the top of my head.
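
One possible workaround (just a sketch; intersection* and the ::everything 
sentinel are made-up names, not part of the lib): wrap intersection in a fn 
whose 0-arity returns a sentinel standing in for the uncomputable "set of 
everything", and whose other arities treat that sentinel as the identity:

(defn intersection*
  ([] ::everything)                     ; sentinel for "all possible elements"
  ([s] s)
  ([s1 s2]
   (cond (= s1 ::everything) s2
         (= s2 ::everything) s1
         :else (intersection s1 s2))))

(r/reduce intersection* [#{1 2} #{2 3}])  ;==> #{2}
(r/reduce intersection* [])  ;==> ::everything (the sentinel leaks on empty input)

The same wrapper ought to work as the combinef for fold, too.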

--Leif

On Tuesday, May 8, 2012 11:20:37 AM UTC-4, Rich Hickey wrote:
>
> I'm happy to have pushed [1] today the beginnings of a new Clojure library 
> for higher-order manipulation of collections, based upon *reduce* and 
> *fold*. Of course, Clojure already has Lisp's *reduce*, which corresponds 
> to the traditional *foldl* of functional programming. *reduce* is based 
> upon sequences, as are many of the core functions of Clojure, like *map*, 
> *filter* etc. So, what could be better? It's a long story, so I'll give you 
> the ending first: 
>
> * There is a new namespace: clojure.core.reducers 
> * It contains new versions of *map*, *filter* etc based upon transforming 
> reducing functions - reducers 
> * It contains a new function, **fold**, which is a parallel reduce+combine 
> * *fold* uses **fork/join** when working with (the existing!) Clojure 
> vectors and maps 
> * Your new parallel code has exactly the same shape as your existing 
> seq-based code 
> * The reducers are composable 
> * Reducer implementations are primarily functional - no iterators 
> * The model uses regular data structures, not 'parallel collections' or 
> other OO malarkey 
> * It's fast, and can become faster still 
> * This is work-in-progress 
>
> I've described the library in more detail here: 
>
>
> http://clojure.com/blog/2012/05/08/reducers-a-library-and-model-for-collection-processing.html
>  
>
> Rich 
>
> [1] 
> https://github.com/clojure/clojure/commit/89e5dce0fdfec4bc09fa956512af08d8b14004f6
