Thanks for the advice,
I think this may be a solution.
In case you've experimented with this setting, could you please tell me
what the side effects of limiting the segment size are?
Will this cause searches to run slower?
Markus Wiederkehr wrote:
I am not an expert, but maybe the occasionally high memory usage is
because Lucene is merging multiple index segments together.
Gusenbauer Stefan wrote:
A few weeks ago I had a similar problem too. I'll describe my problem
and the solution for it:
I'm indexing docs and every parsed document is stored in an ArrayList.
This solution worked for small directories with a small number of
files in them, but when the things ar
Harald Stowasser wrote:
>Stanislav Jordanov wrote:
>>Hi guys,
>>Building some huge index (about 500,000 docs totaling to 10megs of plain
>>text) we've run into the following problem:
>>Most of the time the IndexWriter process consumes a fairly small amount
>>of memory (about 32 megs).
Stanislav Jordanov wrote:
> Hi guys,
> Building some huge index (about 500,000 docs totaling to 10megs of plain
> text) we've run into the following problem:
> Most of the time the IndexWriter process consumes a fairly small amount
> of memory (about 32 megs).
> However, as the index size grows
I am not an expert, but maybe the occasionally high memory usage is
because Lucene is merging multiple index segments together.
Maybe it would help if you set maxMergeDocs to 10,000 or something. In
your case that would mean that the minimum number of index segments
would be 50.
But again, this m
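The segment arithmetic behind that suggestion can be sketched in plain Java. This is only an illustration of the estimate above, not Lucene code: the `SegmentMath` class and `minSegments` method are made up for this sketch, and the Lucene setter mentioned in the comment is from the classic `IndexWriter` API (its exact form varies between versions).

```java
// Sketch: why maxMergeDocs = 10,000 implies at least 50 segments
// for a 500,000-document index.
//
// With the classic Lucene IndexWriter you would cap segment size via
// something like:
//     writer.setMaxMergeDocs(10000);
// (check the API of your Lucene version; older releases exposed a
// public maxMergeDocs field instead).
public class SegmentMath {

    // Minimum number of segments once no segment may hold more than
    // maxMergeDocs documents: ceiling of totalDocs / maxMergeDocs.
    static int minSegments(int totalDocs, int maxMergeDocs) {
        return (totalDocs + maxMergeDocs - 1) / maxMergeDocs;
    }

    public static void main(String[] args) {
        // 500,000 docs / 10,000 docs per segment -> 50 segments minimum,
        // matching the figure quoted in the thread.
        System.out.println(minSegments(500000, 10000));
    }
}
```

More segments mean merges stay small (so the memory spikes during merging shrink), at the cost of searches having to consult more segments.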