Yet another approach that might work for you, depending on your
requirements, is to use a lazy sequence to access your data. I did that
for a load of Twitter data that would have been too large to hold in memory
at any one time.
Here's the relevant bit (I think), copied and pasted:
(defn out-
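(The original function was cut off above.) As a hedged illustration of the lazy-sequence approach being described, here is a minimal sketch that streams a large file line by line and aggregates a result without holding every line in memory; the function name and the "sum the first column" task are mine, not from the truncated message:

```clojure
(require '[clojure.java.io :as io]
         '[clojure.string :as str])

;; Illustrative sketch: lazily stream a file and fold it into a
;; single running total. line-seq yields lines on demand, and
;; reduce consumes them one at a time, so only the accumulator
;; stays in memory. All processing happens inside with-open,
;; before the reader is closed.
(defn sum-first-column [path]
  (with-open [rdr (io/reader path)]
    (reduce +
            0
            (map #(Long/parseLong (first (str/split % #"\s+")))
                 (line-seq rdr)))))
```

The key point is that the reduction finishes while the reader is still open; returning a lazy seq out of `with-open` would fail once the reader is closed.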
On Apr 9, 2012, at 10:05 PM, Andy Wu wrote:
> Hi there,
>
> I'm studying algo-class.org, and one of its programming assignments
> gives you a file containing contents like below:
> 1 2
> 1 7
> 2 100
> ...
>
> There are roughly 5 million lines, and I want to first construct a
> vector of vect
On Mon, Apr 9, 2012 at 10:05 PM, Andy Wu wrote:
> (def int-vec (with-open [rdr (clojure.java.io/reader "")]
>                (doall (map convert (line-seq rdr)))))
This will convert all 5 million lines to a 5 million element vector of
vector pairs. That's certainly a lot of memory, and
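If the full in-memory vector really is needed, one lower-overhead variant is to parse each line eagerly into a pair of primitives-backed longs inside the `with-open` scope. This is only a sketch under assumptions: `convert` below is a hypothetical stand-in for the poster's (unseen) parsing function, and the whitespace-delimited "1 7" line format is taken from the sample data above:

```clojure
(require '[clojure.java.io :as io]
         '[clojure.string :as str])

;; Hypothetical stand-in for the poster's `convert`:
;; parses a line like "1 7" into the pair [1 7].
(defn convert [line]
  (mapv #(Long/parseLong %) (str/split (str/trim line) #"\s+")))

;; mapv is eager, so the whole file is realized before the reader
;; closes -- no doall needed, and no intermediate lazy seq is
;; retained beyond the result vector itself.
(defn read-int-vec [path]
  (with-open [rdr (io/reader path)]
    (mapv convert (line-seq rdr))))
```

This still holds all 5 million pairs at once, which is the memory cost being discussed; it just avoids realizing a separate lazy sequence on top of the result.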