This sounds like an issue with your application's logic rather than with Lucene itself.
If you're worried about having too many documents in one index, you can split them across
multiple indexes, then search across all of them and merge the results, and you're done.
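A minimal sketch of what "search across them and merge the results" can look like, assuming the Lucene 3.x API that was current at the time; the index paths and the "body" field are placeholders, not anything from this thread:

    import org.apache.lucene.index.IndexReader;
    import org.apache.lucene.index.MultiReader;
    import org.apache.lucene.index.Term;
    import org.apache.lucene.search.IndexSearcher;
    import org.apache.lucene.search.TermQuery;
    import org.apache.lucene.search.TopDocs;
    import org.apache.lucene.store.Directory;
    import org.apache.lucene.store.FSDirectory;

    import java.io.File;

    public class MultiIndexSearch {
        public static void main(String[] args) throws Exception {
            // Placeholder paths: each directory holds one of the smaller indexes.
            Directory dir1 = FSDirectory.open(new File("/data/index-1"));
            Directory dir2 = FSDirectory.open(new File("/data/index-2"));

            IndexReader r1 = IndexReader.open(dir1);
            IndexReader r2 = IndexReader.open(dir2);

            // MultiReader presents the sub-indexes as one logical index,
            // so hits from both are scored and ranked together.
            MultiReader multi = new MultiReader(r1, r2);
            IndexSearcher searcher = new IndexSearcher(multi);

            TopDocs hits = searcher.search(new TermQuery(new Term("body", "lucene")), 10);
            System.out.println("total hits across both indexes: " + hits.totalHits);

            searcher.close();
            multi.close(); // closing the MultiReader also closes the sub-readers
        }
    }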
On Mon, Feb 6, 2012 at 10:50 AM, Peter Miller <
peter.mil...@objectconsulting.com.au> wrote:
> Hi,
>
> I have
Hi,
I have a somewhat unusual set of requirements and am looking for advice. I have
searched the archives and seen some relevant posts, but they are fairly old and not
a close match, so I thought I would give this a try.
We will eventually have about 50TB of raw, non-searchable
I was trying to, but I don't know how to, even though I have read some of your blog posts.
On Sun, Feb 5, 2012 at 10:22 PM, Michael McCandless <
luc...@mikemccandless.com> wrote:
> Are you using near-real-time readers?
>
> (IndexReader.open(IndexWriter))
>
> Mike McCandless
>
> http://blog.mikemccandless.com
>
> On
Are you using near-real-time readers?
(IndexReader.open(IndexWriter))
Mike McCandless
http://blog.mikemccandless.com
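A minimal sketch of the near-real-time pattern being suggested, assuming Lucene 3.5, where the exact signatures are IndexReader.open(IndexWriter, boolean) and IndexReader.openIfChanged; the path and fields are placeholders:

    import org.apache.lucene.analysis.standard.StandardAnalyzer;
    import org.apache.lucene.document.Document;
    import org.apache.lucene.document.Field;
    import org.apache.lucene.index.IndexReader;
    import org.apache.lucene.index.IndexWriter;
    import org.apache.lucene.index.IndexWriterConfig;
    import org.apache.lucene.index.Term;
    import org.apache.lucene.search.IndexSearcher;
    import org.apache.lucene.search.TermQuery;
    import org.apache.lucene.store.Directory;
    import org.apache.lucene.store.FSDirectory;
    import org.apache.lucene.util.Version;

    import java.io.File;

    public class NrtExample {
        public static void main(String[] args) throws Exception {
            Directory dir = FSDirectory.open(new File("/tmp/nrt-index"));
            IndexWriter writer = new IndexWriter(dir,
                    new IndexWriterConfig(Version.LUCENE_35, new StandardAnalyzer(Version.LUCENE_35)));

            // Index a document; no commit() is needed for it to become visible via NRT.
            Document doc = new Document();
            doc.add(new Field("id", "1", Field.Store.YES, Field.Index.NOT_ANALYZED));
            doc.add(new Field("body", "hello near real time", Field.Store.YES, Field.Index.ANALYZED));
            writer.addDocument(doc);

            // Open a near-real-time reader directly from the writer.
            IndexReader reader = IndexReader.open(writer, true);
            IndexSearcher searcher = new IndexSearcher(reader);
            System.out.println(searcher.search(new TermQuery(new Term("body", "hello")), 10).totalHits);

            // After more updates, refresh cheaply instead of reopening from disk.
            writer.updateDocument(new Term("id", "1"), doc);
            IndexReader newReader = IndexReader.openIfChanged(reader, writer, true);
            if (newReader != null) {
                reader.close();
                reader = newReader;
                searcher = new IndexSearcher(reader);
            }

            searcher.close();
            reader.close();
            writer.close();
            dir.close();
        }
    }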
On Sun, Feb 5, 2012 at 9:03 AM, Cheng wrote:
> Hi Uwe,
>
> My challenge is that I need to update/modify the indexes frequently while
> providing the search capability. I was try
Hi Uwe,
My challenge is that I need to update/modify the indexes frequently while still
providing search. I was trying to use FSDirectory, but found that reading from and
writing to it was unbearably slow, so I am now trying RAMDirectory, which is fast.
I don't know o
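For an index that must stay on disk, write speed is usually dominated by how often commit() is called and how much is buffered in RAM rather than by the Directory choice. A sketch of that kind of setup, assuming Lucene 3.5; the path and buffer size are arbitrary:

    import org.apache.lucene.analysis.standard.StandardAnalyzer;
    import org.apache.lucene.index.IndexWriter;
    import org.apache.lucene.index.IndexWriterConfig;
    import org.apache.lucene.store.Directory;
    import org.apache.lucene.store.FSDirectory;
    import org.apache.lucene.util.Version;

    import java.io.File;

    public class BufferedDiskWriter {
        public static IndexWriter openWriter(File path) throws Exception {
            // FSDirectory.open picks a sensible implementation for the platform
            // (e.g. MMapDirectory or NIOFSDirectory), so the index lives on disk
            // but reads go through the OS file cache.
            Directory dir = FSDirectory.open(path);

            IndexWriterConfig config = new IndexWriterConfig(
                    Version.LUCENE_35, new StandardAnalyzer(Version.LUCENE_35));

            // Buffer more documents in RAM before flushing a new segment.
            config.setRAMBufferSizeMB(256.0);

            // Call commit() on the returned writer periodically, not after
            // every document; per-document commits are very expensive.
            return new IndexWriter(dir, config);
        }
    }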
Hi Cheng,
It seems that you are using RAMDirectory for *caching*; otherwise it would make no
sense to write changes back. In recent Lucene versions this is not a good idea,
especially for large indexes (RAMDirectory eats your heap space and allocates
millions of small byte[] arrays, ...). If you need somethin