Sorry, I don't have any directly relevant experience to share. As an experiment, I suggest deferring the sh execution: instead of calling sh during processing, log the generated commands to a script file, then execute that one big script at the end. That should make the program easier to profile and debug.
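A minimal sketch of what I mean, assuming your processing loop currently shells out once per record via clojure.java.shell/sh. The names `commands` and the script path are hypothetical placeholders, not from your gist:

```clojure
(require '[clojure.java.io :as io])

(defn write-script
  "Write each generated shell command to one script file instead of
  invoking sh inside the lazy-seq processing loop."
  [path commands]
  (with-open [w (io/writer path)]
    (.write w "#!/bin/sh\n")
    (doseq [cmd commands]            ;; commands may be a lazy seq
      (.write w (str cmd "\n")))))

;; After processing finishes, run the script once:
;; (clojure.java.shell/sh "sh" "/tmp/join-commands.sh")
```

That way the lazy-seq traversal does nothing but I/O to a single writer, and any leak you still see can't be blamed on per-record subprocess overhead.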
On Feb 27, 2012, at 11:13 PM, Sunil S Nandihalli wrote:
> Hi Everybody,
> I am using lazy seqs to join two very large CSV files. I am very certain
> that I am not holding on to any of the heads, and if I were, the JVM would
> be out of memory far sooner than what I am seeing currently. The file is
> about 73 GB and the RAM allocated to the JVM is about 8 GB. It seems like
> a very gradual leak. Has anybody else encountered similar problems? In
> case some of you feel that my code might be the culprit, the following
> gist has the source:
>
> https://gist.github.com/1929345
>
> Thanks,
> Sunil.