RE: How best to handle a reasonable amount of data (25TB+)

2012-02-07 Thread Peter Miller
may help. Some. Try filling out the spreadsheet here: http://www.lucidimagination.com/blog/2011/09/14/estimating-memory-and-storage-for-lucenesolr/ and you'll swiftly find out how hard abstract estimations are. Best, Erick. On Tue, Feb 7, 2012 at 9:07 PM, Peter Miller wrote: > Oops again! Turns
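To give a feel for why such estimates are hard, a back-of-the-envelope sizing calculation might look like the sketch below. The document count, per-document byte figures, and overhead factor are illustrative placeholders, not numbers from the linked spreadsheet.

    // Hypothetical back-of-the-envelope index sizing; all constants are
    // placeholder assumptions, not figures from the Lucid Imagination spreadsheet.
    public class IndexSizeEstimate {
        public static void main(String[] args) {
            long numDocs = 5_000_000_000L;      // expected document count (assumption)
            long indexedBytesPerDoc = 4_000L;   // avg bytes of indexed text per doc (assumption)
            long storedBytesPerDoc = 1_000L;    // avg bytes of stored fields per doc (assumption)
            double indexOverhead = 0.35;        // inverted-index size as a fraction of raw text (assumption)

            long indexSize = (long) (numDocs * indexedBytesPerDoc * indexOverhead)
                           + numDocs * storedBytesPerDoc;
            System.out.printf("Rough index size: %.1f TB%n", indexSize / 1e12);
        }
    }

Even this toy arithmetic shows how sensitive the result is to assumptions about per-document size and what is stored versus only indexed, which is the point about abstract estimations.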

RE: How best to handle a reasonable amount of data (25TB+)

2012-02-07 Thread Peter Miller
a/Lucandra would be a better option anyways. If Cassandra offers some of the same advantages as the OpenStack Swift object store does, then it should be the way to go. Still looking for thoughts... Thanks, The Captn -Original Message----- From: Peter Miller [mailto:peter.mil...@objectconsult

RE: How best to handle a reasonable amount of data (25TB+)

2012-02-07 Thread Peter Miller
Best, Erick. On Mon, Feb 6, 2012 at 11:17 PM, Peter Miller wrote: > Thanks for the response. Actually, I am more concerned with trying to use an > Object Store for the indexes. The next concern is the use of a local index > versus the sharded ones, but I'm more relaxed about that n
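For the "sharded" side of that trade-off, Solr's legacy distributed search lets a client fan one query out over several shard cores with the shards parameter, with any one node aggregating the results. A minimal SolrJ sketch, assuming a reasonably recent SolrJ client; the host names and core name are made up:

    import org.apache.solr.client.solrj.SolrQuery;
    import org.apache.solr.client.solrj.impl.HttpSolrClient;
    import org.apache.solr.client.solrj.response.QueryResponse;

    public class ShardedQuery {
        public static void main(String[] args) throws Exception {
            // Hypothetical hosts/core; the node queried here aggregates the distributed results.
            try (HttpSolrClient client =
                     new HttpSolrClient.Builder("http://search1:8983/solr/docs").build()) {
                SolrQuery q = new SolrQuery("body:lucene");
                // Legacy distributed search: list every shard the query should cover.
                q.set("shards", "search1:8983/solr/docs,search2:8983/solr/docs");
                q.setRows(10);
                QueryResponse rsp = client.query(q);
                System.out.println("hits: " + rsp.getResults().getNumFound());
            }
        }
    }

This is only one way to do it; the local-index-per-node alternative skips the distributed merge entirely at the cost of each node seeing only its own data.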

RE: How best to handle a reasonable amount of data (25TB+)

2012-02-06 Thread Peter Miller
nt to data (25TB+) It sounds like this is not an issue with Lucene but with the logic of your app. If you're afraid of having too many docs in one index, you can make multiple indexes, then search across them and merge the results. On Mon, Feb 6, 2012 at 10:50 AM, Peter Miller < peter.mil...@objectconsulting.
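The "multiple indexes, search across them, merge" suggestion maps naturally onto Lucene's MultiReader, which presents several index readers as one logical index so a single IndexSearcher handles the merging and ranking. A minimal sketch, assuming recent Lucene APIs; the index paths and field name are made up:

    import java.nio.file.Paths;
    import org.apache.lucene.index.DirectoryReader;
    import org.apache.lucene.index.MultiReader;
    import org.apache.lucene.index.Term;
    import org.apache.lucene.search.IndexSearcher;
    import org.apache.lucene.search.TermQuery;
    import org.apache.lucene.search.TopDocs;
    import org.apache.lucene.store.FSDirectory;

    public class SearchAcrossIndexes {
        public static void main(String[] args) throws Exception {
            // Open each physical index separately (paths are placeholders).
            DirectoryReader r1 = DirectoryReader.open(FSDirectory.open(Paths.get("/data/index1")));
            DirectoryReader r2 = DirectoryReader.open(FSDirectory.open(Paths.get("/data/index2")));

            // MultiReader exposes the sub-indexes as one logical index;
            // the searcher merges and ranks hits across all of them.
            try (MultiReader all = new MultiReader(r1, r2)) {
                IndexSearcher searcher = new IndexSearcher(all);
                TopDocs hits = searcher.search(new TermQuery(new Term("body", "lucene")), 10);
                System.out.println("total hits: " + hits.totalHits);
            }
        }
    }

Closing the MultiReader closes the sub-readers as well, so splitting data across many smaller indexes stays transparent to the search side of the application.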

How best to handle a reasonable amount of data (25TB+)

2012-02-05 Thread Peter Miller
Hi, I have a little bit of an unusual set of requirements, and I am looking for advice. I have researched the archives, and seen some relevant posts, but they are fairly old and not specifically a match, so I thought I would give this a try. We will eventually have about 50TB raw, non-searchab