Also, are you indexing largish documents? Lucene must fully index the
doc, and then flush, so for such large docs it can easily use more
than the 50 MB buffer you allotted.
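If truly huge documents are the trigger, one mitigation worth trying is to cap the number of tokens indexed per field. A minimal sketch against the Lucene 3.x API; the directory path and the 10,000-token cap are placeholders, not recommendations:

```java
import java.io.File;

import org.apache.lucene.analysis.standard.StandardAnalyzer;
import org.apache.lucene.index.IndexWriter;
import org.apache.lucene.store.FSDirectory;
import org.apache.lucene.util.Version;

public class CappedFieldLength {
    public static void main(String[] args) throws Exception {
        FSDirectory dir = FSDirectory.open(new File("/tmp/index")); // placeholder path
        // MaxFieldLength bounds how many tokens of each field get indexed,
        // so one enormous document can't blow past the RAM buffer on its own.
        IndexWriter writer = new IndexWriter(dir,
                new StandardAnalyzer(Version.LUCENE_30),
                new IndexWriter.MaxFieldLength(10000)); // illustrative cap
        writer.close();
    }
}
```

Note this truncates what gets indexed for oversized fields, so it trades completeness for bounded memory.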
There were also some recent memory-leak fixes for such large documents that you might be hitting. Which Lucene version are you using?
I will check it out!!
Saurabh Agarwal
On Thu, May 27, 2010 at 11:13 PM, Erick Erickson wrote:
The larger your RAMBufferSize, the more memory you consume, FWIW.
OK, then, does it always OOM on the same document? Are you trying to index
any particularly large documents?
Erick
On Thu, May 27, 2010 at 1:28 PM, Saurabh Agarwal wrote:
RAMBufferSize is 50 MB; I tried 200 as well.
The index is unoptimized.
MergeFactor is the default 10, and I have not changed it.
MaxBufferedDocs is also the default.
Saurabh Agarwal
On Thu, May 27, 2010 at 10:31 PM, Erick Erickson wrote:
What have you set the various IndexWriter properties to? Particularly
things like merge factor, max buffered docs, and RAM buffer size.
The first thing I'd look at is MergeFactor. From the JavaDocs:
"Determines how often segment indices are merged by addDocument(). With
smaller values, less RAM is used."
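All three of those knobs are set on the writer itself. A hedged sketch of what that looks like against the Lucene 3.x API (the path and the specific values are placeholders to experiment with, not tuned recommendations):

```java
import java.io.File;

import org.apache.lucene.analysis.standard.StandardAnalyzer;
import org.apache.lucene.index.IndexWriter;
import org.apache.lucene.store.FSDirectory;
import org.apache.lucene.util.Version;

public class WriterTuning {
    public static void main(String[] args) throws Exception {
        IndexWriter writer = new IndexWriter(
                FSDirectory.open(new File("/tmp/index")), // placeholder path
                new StandardAnalyzer(Version.LUCENE_30),
                IndexWriter.MaxFieldLength.UNLIMITED);

        // Smaller merge factor -> segments merge more often, less RAM held.
        writer.setMergeFactor(10); // the default

        // Flush by RAM usage rather than by buffered-doc count.
        writer.setMaxBufferedDocs(IndexWriter.DISABLE_AUTO_FLUSH);
        writer.setRAMBufferSizeMB(16.0); // try lowering this on a 512 MB machine

        writer.close();
    }
}
```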
Hi,
when I run Lucene on a 512 MB system, I get the following error:
Exception in thread "main" java.lang.OutOfMemoryError: Java heap space
at
org.apache.lucene.index.DocumentsWriter$ByteBlockAllocator.getByteBlock(DocumentsWriter.java:1206)
and sometimes
An unexpected error h
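Before tuning Lucene at all, it's worth confirming how much heap the JVM actually has: on a 512 MB box the default -Xmx ceiling can be quite small. A quick stdlib-only check:

```java
public class HeapCheck {
    public static void main(String[] args) {
        // Maximum heap the JVM will attempt to use (the -Xmx ceiling).
        long maxBytes = Runtime.getRuntime().maxMemory();
        System.out.println("max heap MB: " + maxBytes / (1024 * 1024));
    }
}
```

If this prints far less than you expect, raise the ceiling explicitly (e.g. `java -Xmx256m ...`) before re-running the indexer.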