Dear List, I'm experiencing a memory leak and I don't understand why.
I have 50 files on disk, called "data-1.edn" through "data-50.edn", and I run the following code:

  (def all-processed-data
    (reduce (fn [ret f]
              (merge ret (process-data
                           (clojure.tools.reader.edn/read-string (slurp f)))))
            {}
            file-list))

Here "file-list" is a sequence of java.io.File objects pointing at the 50 data files, and "process-data" is a function that produces a map from the data read in.

Although each individual data file is about 100 MB, the result of processing one file is a map of only about 200 KB, so the map returned by the whole expression should come to only about 10 MB. Nevertheless, executing the code above fills up memory and eventually stops Clojure from functioning. Why?

Thanks a lot for any suggestions!

Joachim.
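
P.S. For completeness, here is a minimal sketch of how such a file-list can be built; the directory name "data" is just a placeholder, not my actual path:

  (require '[clojure.java.io :as io])

  ;; Collect the 50 EDN files from a (placeholder) "data" directory
  ;; as java.io.File objects, keeping only names like "data-7.edn".
  (def file-list
    (->> (file-seq (io/file "data"))
         (filter #(re-matches #"data-\d+\.edn" (.getName %)))))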