OK, it sounds like you need to increase the RAM your JVM is allowed to use, or make your documents smaller.
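
You can check the current ceiling from inside the JVM with Runtime.maxMemory(), and raise it at launch with the standard -Xmx flag. A tiny sketch (HeapCheck and YourIndexer are just illustrative names):

    // Minimal sketch: print the maximum heap this JVM may use.
    // Raise it at launch, e.g. "java -Xmx512m YourIndexer".
    public class HeapCheck {
        public static void main(String[] args) {
            long maxBytes = Runtime.getRuntime().maxMemory();
            System.out.println("Max heap: " + (maxBytes / (1024 * 1024)) + " MB");
        }
    }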

Mike

Aditi Goyal wrote:

Thanks for showing interest, Mike.
The OOME comes in the middle of setting the value of one of the fields in the doc. That field has a fairly large value. Maybe that could be the reason?



On Fri, Oct 3, 2008 at 4:57 PM, Michael McCandless <[EMAIL PROTECTED]> wrote:


Note that large stored fields do not use up any RAM in IndexWriter's RAM buffer because these stored fields are immediately written to the directory
and not stored in RAM for very long.
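
For example (a minimal sketch against the 2.x-era API; hugeText stands in for your large value, and the index path is made up):

    import org.apache.lucene.analysis.standard.StandardAnalyzer;
    import org.apache.lucene.document.Document;
    import org.apache.lucene.document.Field;
    import org.apache.lucene.index.IndexWriter;
    import org.apache.lucene.store.FSDirectory;

    public class StoredFieldDemo {
        public static void main(String[] args) throws Exception {
            IndexWriter writer = new IndexWriter(
                FSDirectory.getDirectory("/tmp/index"), new StandardAnalyzer(), true);
            writer.setRAMBufferSizeMB(16);

            String hugeText = "...";  // imagine many MB of text here
            Document doc = new Document();
            // Stored but not indexed: streamed to the directory at
            // addDocument time, so it does not count against the RAM buffer.
            doc.add(new Field("content", hugeText, Field.Store.YES, Field.Index.NO));
            writer.addDocument(doc);
            writer.close();
        }
    }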

Aditi, I'd love to see the full stack trace of the OOME that was originally
hit if you still have it...

Mike


Ganesh wrote:

A single document of 16 MB seems big. I think you are trying to store
the entire document content. If so, drop the stored field and instead store a reference to the content in the database, which you can use to retrieve the
content later.
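
Something like this (just a sketch; dbId, bigText, and the field names are placeholders, and the Field.Index constants are the 2.x-era names):

    import org.apache.lucene.document.Document;
    import org.apache.lucene.document.Field;

    public class DocBuilder {
        // Index the text for search, but store only a small DB key.
        public static Document makeDoc(String dbId, String bigText) {
            Document doc = new Document();
            // Stored key: fetch the full content from the database later.
            doc.add(new Field("dbRef", dbId, Field.Store.YES, Field.Index.UN_TOKENIZED));
            // Searchable, but not stored in the Lucene index.
            doc.add(new Field("content", bigText, Field.Store.NO, Field.Index.TOKENIZED));
            return doc;
        }
    }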

Regards
Ganesh

----- Original Message ----- From: "Aditi Goyal" <[EMAIL PROTECTED]>
To: <java-user@lucene.apache.org>
Sent: Friday, October 03, 2008 3:03 PM
Subject: Re: Document larger than setRAMBufferSizeMB()


Thanks Anshum.
Although it raises another query: committing the current buffer will commit the docs added before it, but what will happen to the current doc which threw an error while a field was being added to it? Will that also get committed half-done?

Thanks a lot
Aditi

On Fri, Oct 3, 2008 at 2:12 PM, Anshum <[EMAIL PROTECTED]> wrote:

Hi Aditi,

I guess increasing the buffer size would be a solution here, but you may not know the expected max doc size in advance. In that case, the best way to handle it would be a regular try/catch block in which you commit the current buffer. At the least, you could just continue the loop after doing whatever you wish to do inside the exception-handling block.
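
Roughly like this (only a sketch; buildDocument stands in for your own parsing/field setup, and commit() assumes Lucene 2.4 -- on older 2.x you would call flush()):

    import org.apache.lucene.document.Document;
    import org.apache.lucene.index.IndexWriter;
    import java.util.List;

    public class SafeAdd {
        // Add docs one by one; on OOME, commit what is already buffered
        // and continue with the next input.
        static void addAll(IndexWriter writer, List<String> inputs) throws Exception {
            for (String raw : inputs) {
                try {
                    writer.addDocument(buildDocument(raw));
                } catch (OutOfMemoryError e) {
                    writer.commit();  // Lucene 2.4; use flush() on older 2.x
                    // The failed doc is skipped; the loop just continues.
                }
            }
        }

        // Placeholder for your own parsing/field setup.
        static Document buildDocument(String raw) {
            Document doc = new Document();
            // ... add your fields here ...
            return doc;
        }
    }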

--
Anshum Gupta
Naukri Labs!
http://ai-cafe.blogspot.com

The facts expressed here belong to everybody, the opinions to me. The
distinction is yours to draw............

On Fri, Oct 3, 2008 at 1:56 PM, Aditi Goyal <[EMAIL PROTECTED]> wrote:

Hi Everyone,

I have an index which I open just once and keep open. I keep adding
documents to it until I reach a limit of 500.
After this, I close the index and open it again. (This is done in order to
save the time taken by repeatedly opening and closing the index.)
Also, I have set setRAMBufferSizeMB to 16 MB.
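
In case it helps, my loop looks roughly like this (a sketch only; makeDocument stands in for my parsing code, and dir, analyzer, and inputs for my actual setup):

    IndexWriter writer = new IndexWriter(dir, analyzer, false);
    writer.setRAMBufferSizeMB(16);
    int count = 0;
    for (String input : inputs) {
        writer.addDocument(makeDocument(input));  // makeDocument: my parsing code
        if (++count % 500 == 0) {
            writer.close();                                   // close after every 500 docs...
            writer = new IndexWriter(dir, analyzer, false);   // ...and reopen the same index
            writer.setRAMBufferSizeMB(16);
        }
    }
    writer.close();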

If the document size itself is greater than 16 MB, what will happen in this
case?
It is throwing:
java.lang.OutOfMemoryError: Java heap space
Now, my query is:
Can we change something in the way we parse/index to make it more memory
friendly, so that it doesn't throw this exception?
And can it be caught and overcome gracefully?


Thanks a lot
Aditi


