Re: beginner clojure question: OutOfMemory error processing (slightly) large data file

2011-03-22 Thread Avram
Thanks, Stuart.

> With Leiningen, you can add the :jvm-opts option in project.clj

Cool, this is what I was looking for :)

>     (def signals (vec ...))
>
> says that you want the entire result, as a vector, stored as the value of
> the Var `signals`.  That means your entire result data must fit ...
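For reference, the :jvm-opts setting mentioned above might look like this in project.clj (a minimal sketch; the project name, version, and heap size are illustrative, not from the thread):

```clojure
;; project.clj -- hypothetical project; only :jvm-opts is the point here
(defproject large-file-demo "0.1.0-SNAPSHOT"
  :dependencies [[org.clojure/clojure "1.2.0"]]
  ;; Flags passed to the JVM that Leiningen launches,
  ;; e.g. a 512 MB maximum heap:
  :jvm-opts ["-Xmx512m"])
```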

Re: beginner clojure question: OutOfMemory error processing (slightly) large data file

2011-03-22 Thread Avram
Thanks, Ken.

> You'll need to avoid holding onto the head of your line-seq, which
> means you'll need to make multiple passes over the data, one for the
> as, one for the bs, and etc., with the output a lazy seq of lazy seqs.

Actually, it would be great to make separate, asynchronous passes for t ...
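Ken's multiple-pass suggestion could be sketched like this (the filename and the predicates in the usage comment are hypothetical; the point is that each pass reduces eagerly inside with-open, so no pass retains the head of a line-seq):

```clojure
(require '[clojure.java.io :as io])

(defn count-matching
  "Makes one full pass over file, counting lines that satisfy pred.
  Reducing eagerly inside with-open means the head of the line-seq
  is never held, so memory use stays constant."
  [file pred]
  (with-open [rdr (io/reader file)]
    (reduce (fn [n line] (if (pred line) (inc n) n))
            0
            (line-seq rdr))))

;; Hypothetical usage: one pass per category.
;; (count-matching "data.txt" #(.startsWith ^String % "a"))
;; (count-matching "data.txt" #(.startsWith ^String % "b"))
```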

Re: beginner clojure question: OutOfMemory error processing (slightly) large data file

2011-03-22 Thread Stuart Sierra
Oh, and the standard JDK class java.util.zip.GZIPInputStream implements gzip decompression.

-Stuart Sierra
clojure.com
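Wrapping that class around a Clojure reader might look like this (a sketch; the function name and file path are assumptions, not from the thread):

```clojure
(require '[clojure.java.io :as io])
(import 'java.util.zip.GZIPInputStream)

(defn read-gzipped-lines
  "Stream-decompresses a gzipped text file and returns its lines as a
  vector. Fine for files that fit in memory; for larger files, process
  the line-seq inside with-open instead of returning it."
  [path]
  (with-open [rdr (io/reader (GZIPInputStream. (io/input-stream path)))]
    (vec (line-seq rdr))))

;; Hypothetical usage:
;; (read-gzipped-lines "data.txt.gz")
```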

Re: beginner clojure question: OutOfMemory error processing (slightly) large data file

2011-03-22 Thread Stuart Sierra
Hi Avram,

Assuming you're using the Sun/Oracle JDK, you can increase the size of the Java heap with the -Xmx command-line option. For example:

java -Xmx512m -cp clojure.jar:your-source-dir clojure.main

will run Java with a 512 MB heap. This increases the amount of memory available to your ...

Re: beginner clojure question: OutOfMemory error processing (slightly) large data file

2011-03-22 Thread Ken Wesson
On Tue, Mar 22, 2011 at 4:00 PM, Avram wrote:
> Hi,
>
> I (still) consider myself new to clojure.  I am trying to read a 37Mb
> file that will grow 500k every 2 days. I don't consider this input
> large enough to merit using Hadoop and I'd like to process it in
> Clojure in an efficient ...

beginner clojure question: OutOfMemory error processing (slightly) large data file

2011-03-22 Thread Avram
Hi, I (still) consider myself new to clojure. I am trying to read a 37Mb file that will grow by 500k every 2 days. I don't consider this input large enough to merit using Hadoop, and I'd like to process it in Clojure in an efficient, speedy, and idiomatic way. I simply want something akin ...
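One idiomatic sketch of this kind of line-by-line processing (the filename and the choice of tallying first characters are assumptions for illustration; the original message is truncated before saying exactly what is wanted):

```clojure
(require '[clojure.java.io :as io])

(defn tally-first-chars
  "Single lazy pass over a text file: counts lines by their first
  character. frequencies consumes the seq eagerly inside with-open,
  so the head is never retained and the file is never fully in memory."
  [file]
  (with-open [rdr (io/reader file)]
    (frequencies (map first (line-seq rdr)))))

;; Hypothetical usage on the 37Mb file:
;; (tally-first-chars "data.txt")
```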