Interestingly, I had planned on using Jackson, but found that my JSON data was not always well-formed and needed minor cleaning first (e.g. double newlines without interleaved commas between JSON chunks), so I had to produce well-formed JSON chunks in a streaming sort of way before parsing. Because my chunks are now considerably smaller, I am not yet sure whether using Jackson on them is as advantageous as clojure.contrib.json. The inadvertent benefit of casting the problem into chunks of newline-delimited records is that there now seem to be many ways to parallelize the processing to extract the values I need.
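Roughly, the cleaning step looks something like this (a minimal sketch, not my actual code, assuming blank lines delimit the records; "data.json" and handle-record are placeholders):

(require '[clojure.contrib.json :as json])
(require '[clojure.java.io :as io])

(defn blank-line? [s]
  (empty? (.trim s)))

(defn json-chunks
  "Lazily groups a seq of lines into well-formed JSON chunks,
  treating runs of blank lines as the separator between records."
  [lines]
  (->> lines
       (partition-by blank-line?)
       (remove #(blank-line? (first %)))
       (map #(apply str (interpose "\n" %)))))

;; Stream the file, parse each cleaned chunk independently.
(with-open [rdr (io/reader "data.json")]
  (doseq [record (map json/read-json (json-chunks (line-seq rdr)))]
    (handle-record record)))

Because each chunk is an independent string, the map over json/read-json is where the parallelization opportunities come in (e.g. pmap instead of map).

-A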
On May 31, 2:38 pm, Ulises <ulises.cerv...@gmail.com> wrote:
> jackson can read/parse large JSON files through its streaming
> API: http://wiki.fasterxml.com/JacksonInFiveMinutes#Streaming_API_Example
>
> U
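For reference, the streaming API that the linked page describes looks roughly like this from Clojure via interop (a sketch, assuming the org.codehaus.jackson package of Jackson 1.x; the file name and the "name" field are placeholders):

(import '(org.codehaus.jackson JsonFactory JsonToken))

;; Walk the token stream and print every value of a "name" field,
;; without ever materializing the whole document in memory.
(let [parser (.createJsonParser (JsonFactory.)
                                (java.io.File. "data.json"))]
  (loop []
    (when-let [token (.nextToken parser)]   ; nextToken returns nil at end of input
      (when (and (= token JsonToken/FIELD_NAME)
                 (= "name" (.getCurrentName parser)))
        (.nextToken parser)                 ; advance from field name to its value
        (println (.getText parser)))
      (recur))))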