Thank you Daniel, but the best I get from MaxBufferedDocs(1) is an OOM error
after trying 5 iterations of 10MB each in the JUnit test provided by Chris,
running inside Eclipse 3.1.
I had already tried MaxBufferedDocs(2) with no success before I posted
the original message.
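In case it helps, this is roughly the writer setup in the test (the index
path and analyzer are placeholders rather than the exact values from Chris's
test, and I am assuming the 1.9-era API where setMaxBufferedDocs exists):

import org.apache.lucene.analysis.standard.StandardAnalyzer;
import org.apache.lucene.index.IndexWriter;
import org.apache.lucene.store.Directory;
import org.apache.lucene.store.FSDirectory;

Directory dir = FSDirectory.getDirectory("/tmp/bigdocs", true);
IndexWriter writer = new IndexWriter(dir, new StandardAnalyzer(), true);
// Flush buffered documents to disk as early as possible so the writer
// does not hold several 10MB documents in RAM at once.
writer.setMaxBufferedDocs(2); // 1 gave the same OOM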
I also tried: write [...]
Subject: Storing large text or binary source documents in the index and
memory usage
Date: Fri, 20 Jan 2006 18:35:41 -0800 (PST)
: otherwise I would have done so already. My real question is question
: number one, which did not receive a reply: is there a formula that can
: tell me if what is happening is reasonable and to be expected, or am I
: doing something wrong?
I've never played with the binary fields much, nor have I ever tried
[...] the answer to question one is that there is no other alternative.
Cheers
From: "George Washington" <[EMAIL PROTECTED]>
Reply-To: java-user@lucene.apache.org
To: java-user@lucene.apache.org
Subject: Storing large text or binary source documents in the index and
memory usage
I would like to store large source documents (>10MB) in the index in their
original form, i.e. as text for text documents or as byte[] for binary
documents.
I have no difficulty adding the source document as a field to the Lucene
index document, but when I write the index document to the index I run out
of memory.
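To make it concrete, this is roughly what the failing step looks like
(field names and sizes are placeholders, and this assumes the Lucene
1.9-style Field API, where the byte[] constructor creates a stored binary
field):

import org.apache.lucene.document.Document;
import org.apache.lucene.document.Field;

// Stand-ins for a real >10MB source document.
String largeText = new String(new char[10 * 1024 * 1024]);
byte[] largeBytes = new byte[10 * 1024 * 1024];

Document doc = new Document();
// Store the text verbatim; Field.Index.NO keeps it out of the inverted index.
doc.add(new Field("source", largeText, Field.Store.YES, Field.Index.NO));
// Binary documents go in through the byte[] constructor and are stored as-is.
doc.add(new Field("raw", largeBytes, Field.Store.YES));
writer.addDocument(doc); // given an IndexWriter "writer"; memory blows up here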