issues building a large index

2005-06-24 Thread Lokesh Bajaj
Hi, I am a newcomer to this list and trying out Lucene for the first time. It looks really useful, and I am evaluating it for a potentially very large index that my company might need to build. As I was investigating Lucene, I wanted to know what the performance of optimize/index merg…

Re: issues building a large index

2005-06-26 Thread Lokesh Bajaj
…seem to be doing anything useful. Lokesh Daniel Naber <[EMAIL PROTECTED]> wrote: On Saturday 25 June 2005 02:10, Lokesh Bajaj wrote: > 3] Does this seem like a JVM issue? Since it's always pointing to a > native method, I am not really sure what to look for or debug. Does your JVM…

RE: issues building a large index

2005-06-29 Thread Lokesh Bajaj
…[mailto:[EMAIL PROTECTED]] Sent: Monday, June 27, 2005 10:08 AM To: java-user@lucene.apache.org Subject: Re: issues building a large index Hi, Perhaps using hprof with cpu=samples may reveal more information about what the CPU is doing. I think this is a valid use case. Otis --- Lokesh Bajaj…
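The hprof sampling suggested above was done via a JVM launch flag on the Sun JDKs of that era. A hedged sketch of such an invocation (`MyIndexer` is a hypothetical driver class, not something from this thread; the flag values are just typical choices):

```shell
# Sample the CPU every 10 ms with 10-frame stack traces while indexing.
# Profiling output is written to java.hprof.txt in the working directory;
# the TRACE sections with the highest sample counts show where time goes.
java -Xrunhprof:cpu=samples,depth=10,interval=10 MyIndexer
```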

"docMap" array in SegmentMergeInfo

2005-07-13 Thread Lokesh Bajaj
I noticed the following code that builds the "docMap" array in SegmentMergeInfo.java for the case where some documents might be deleted from an index: // build array which maps document numbers around deletions if (reader.hasDeletions()) { int maxDoc = reader.maxDoc(); docM…
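The preview cuts off mid-identifier, but the idea of the loop is that each surviving document gets the next contiguous post-merge number, while deleted slots are marked unusable. A self-contained sketch of that mapping logic, with a plain `boolean[]` standing in for the reader's deletion bits (this is an illustration of the technique, not verbatim Lucene code):

```java
// Sketch of the docMap construction: non-deleted documents are
// renumbered densely (0, 1, 2, ...) so no gaps remain after a merge;
// deleted slots map to -1. The boolean[] stands in for
// IndexReader.isDeleted(i).
public class DocMapSketch {
    static int[] buildDocMap(boolean[] deleted) {
        int maxDoc = deleted.length;
        int[] docMap = new int[maxDoc];
        int j = 0;                              // next post-merge doc number
        for (int i = 0; i < maxDoc; i++) {
            docMap[i] = deleted[i] ? -1 : j++;  // -1 marks a deleted slot
        }
        return docMap;
    }

    public static void main(String[] args) {
        // docs 1 and 3 deleted: mapping is 0->0, 1->-1, 2->1, 3->-1, 4->2
        int[] map = buildDocMap(new boolean[]{false, true, false, true, false});
        System.out.println(java.util.Arrays.toString(map));
    }
}
```

Note that the loop is O(maxDoc) and allocates an int per document slot, which is part of what makes merges of very large segments memory-sensitive.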

Re: Problem with deleting and optimizing index

2005-07-24 Thread Lokesh Bajaj
Actually, you should probably not let your index grow beyond one-third the size of your disk. a] You start off with your original index. b] During optimize, Lucene will initially write out files in non-compound file format. c] Lucene will then combine the non-compound files into the compoun…
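The arithmetic behind the one-third rule can be sketched: with the compound file format enabled, an optimize can briefly hold three copies of the index on disk at once (the original segments, the freshly merged non-compound files, and the compound file being assembled from them). The class and method names below are illustrative, not Lucene API:

```java
// Back-of-the-envelope peak disk usage during an optimize with the
// compound file format enabled. Illustrative only.
public class OptimizeDiskBudget {
    static long peakBytes(long indexBytes) {
        long original    = indexBytes; // a] old segments, deleted only at the end
        long nonCompound = indexBytes; // b] merged segment in multi-file form
        long compound    = indexBytes; // c] compound file built from those files
        return original + nonCompound + compound; // ~3x the index size
    }

    public static void main(String[] args) {
        long tenGb = 10L * 1024 * 1024 * 1024;
        System.out.println(peakBytes(tenGb) / (1024L * 1024 * 1024) + " GB peak");
    }
}
```

Hence an index larger than one-third of the disk risks running out of space mid-optimize, leaving temporary files behind.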