Once you get to very large sizes, the efficiency of your own data representation is more likely to be the deciding factor than the data structures of the language you are using. So I don't think there is a specific answer beyond the general advice for optimisation: try it first, and optimise only when you have no choice.
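Just to illustrate the point about representation: here is a minimal sketch (the names make-counters, bump! and counter are mine, purely hypothetical) of keeping per-node counters in a flat primitive long array from Clojure, so each counter costs roughly 8 bytes with no boxing. Whether a flat array fits your tree layout depends on how you assign node ids, so treat it as a sketch under those assumptions rather than a recommendation.

    ;; Hypothetical sketch: per-node counters as a flat primitive array,
    ;; indexed by node id, instead of persistent maps of boxed Longs.

    (defn make-counters
      "Allocate one long counter per tree node, ~8 bytes each."
      ^longs [node-count]
      (long-array node-count))

    (defn bump!
      "Increment the counter for node-id in place."
      [^longs counters node-id]
      (aset counters node-id (inc (aget counters node-id))))

    (defn counter
      "Read the counter for node-id."
      ^long [^longs counters node-id]
      (aget counters node-id))

    ;; Usage:
    ;; (def cs (make-counters 1000000))
    ;; (bump! cs 42)
    ;; (counter cs 42) ;=> 1

The point being that you can stay in Clojure and still drop down to a compact, mutable representation for the hot data while keeping the rest of the code idiomatic.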
Phil

Vjeran Marcinko <vmarci...@gmail.com> writes:
> I am planning to play with implementing some giant in-memory index that is
> basically a tree-like structure containing counters on certain tree nodes.
> It can aggregate billions of data points and will probably consume tens of
> GBs of RAM.
>
> Since space (memory) efficiency is crucial here, I was wondering how good
> Clojure is for this problem, and whether I should just stick to plain Java,
> because it is well known that Clojure's persistent data structures
> sacrifice space (and some speed, but that is not such a big issue here) for
> the sake of immutability and good development practice?