On Wed, Mar 9, 2011 at 6:59 PM, Alan <a...@malloys.org> wrote:
> Clever, but do we really want to encourage writing code that blows
> infinite stack, by burying the problem until all of the JVM's memory
> has been used up for stack? I agree there's a place for this sort of
> thing, but I don't think we would want to make it any kind of default
> for things like map/filter.
There are two situations this would affect. One, incorrect code that goes into infinite recursion would throw OOME instead of SOE. Two, semantically correct code that happens to nest too deeply (which sometimes happens, especially with (reduce (mapcat ...))) would work instead of throwing SOE.

Currently, the latter can be worked around by throwing a doall into things, at the cost of no longer being lazy -- and sometimes that solution causes OOME instead of SOE, when we should really be able to accomplish our goals without either exception.

I'm not averse to it being a doall-like thing that, instead of eagerly realizing a sequence, maintains its laziness but stack-extends that laziness (by maintaining some knowledge of stack depth and doing a-w-s-e on its lazy-seq function if it gets deep enough) -- something like

(let [step (fn somestuff)
      step (if too-deep #(a-w-s-e step %&) step)]
  (lazy-seq ...))
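To make the second situation concrete, here's a toy sketch (the names and numbers are made up, not taken from anything above) of deep lazy nesting blowing the stack, and of the doall workaround trading SOE for a possible OOME:

(defn nest-filters [coll n]
  ;; wrap coll in n layers of lazy filter; nothing is realized yet
  (reduce (fn [s _] (filter even? s)) coll (range n)))

;; (first (nest-filters (range 10) 10000))
;; typically throws StackOverflowError: realizing even the first element
;; has to unwind ~10000 nested lazy-seq thunks on a single stack.

(defn nest-filters-eager [coll n]
  ;; forcing each layer with doall keeps the realization depth shallow,
  ;; but every layer is now fully realized in memory, so a sufficiently
  ;; large coll can OOME instead
  (reduce (fn [s _] (doall (filter even? s))) coll (range n)))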
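And here is a rough, purely hypothetical expansion of that sketch -- run-on-fresh-stack is a made-up stand-in for the stack-extension call, and depth-aware-map a made-up map-like function built on it:

(def ^:dynamic *depth* 0)
(def max-depth 1000)

(defn run-on-fresh-stack
  "Made-up stand-in for the stack-extension call: run the thunk f on a
  new thread (and therefore a brand-new stack) and wait for its result.
  Error handling omitted."
  [f]
  (let [result (promise)]
    (.start (Thread. #(deliver result (f))))
    @result))

(defn depth-aware-map
  "Hypothetical one-collection map whose lazy-seq body tracks how deeply
  nested its realization is; past max-depth the next step runs on a
  fresh stack instead of the current one."
  [f coll]
  (lazy-seq
    (binding [*depth* (inc *depth*)]
      (let [step (fn []
                   (when-let [s (seq coll)]
                     (cons (f (first s)) (depth-aware-map f (rest s)))))
            ;; a raw Thread doesn't convey dynamic bindings, so *depth*
            ;; restarts at 0 on the fresh stack
            step (if (> *depth* max-depth)
                   #(run-on-fresh-stack step)
                   step)]
        (step)))))

Stacking a few thousand such layers would then realize fine where the plain version SOEs, at the cost of a parked thread for every max-depth layers of nesting.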