Hey Chris,

I will check and let you know, just to make sure.
   
Basically, I see the OS allocating memory (up to about 4 GB) while loading the
indexes into memory, and then crashing in the TermInfosReader class. What I
noticed was that the crash occurred when Lucene tried to create a Term array
with the following code:
   
  new Term[indexSize]
   
I assume that, since this is an array, Java was trying to allocate one
consecutive block of memory, which is hard to find even on a 16 GB RAM
machine, especially since (if I'm not mistaken) indexSize here is the
termEnum size (which in my case is rather large).
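A back-of-envelope sketch of how big that single allocation gets (the indexSize value below is made up for illustration, not measured from this index):

```java
// Rough estimate of the heap needed for the references alone in
// new Term[indexSize]. The indexSize figure is a hypothetical example.
public class TermArrayEstimate {
    public static void main(String[] args) {
        long indexSize = 50000000L; // hypothetical term-index entry count
        long refBytes = 8;          // one 64-bit object reference per slot
        long arrayBytes = indexSize * refBytes;
        // The array must be one contiguous block of this size, on top of
        // the heap later used by the Term objects it will point to.
        System.out.println("Contiguous array alone: "
                + arrayBytes / (1024 * 1024) + " MB");
    }
}
```

With these assumed numbers, the array alone needs a few hundred megabytes in a single contiguous block, before counting the Term objects themselves.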
   
I will get back to you about the one-liner. If you have any other thoughts, I'd
be extremely happy to hear them, as this problem is a major road block.

Thanks a million
   
  

Chris Hostetter <[EMAIL PROTECTED]> wrote:
  
: i am recieving the following stack trace:
:
: JVMDUMP013I Processed Dump Event "uncaught", detail 
"java/lang/OutOfMemoryError".
: Exception in thread "main" java.lang.OutOfMemoryError
: at org.apache.lucene.index.TermInfosReader.readIndex(TermInfosReader.java:82)

Is it possible that parts of your application are eating up all of the
heap in your JVM before this exception is encountered? Possibly by
opening the index many times without closing it?
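That failure mode can be mimicked without Lucene at all (the 1 MB "term index" per open is a stand-in, not a real measurement): each open keeps its data reachable, so repeated opens without a close grow the heap until an OOM.

```java
import java.util.ArrayList;
import java.util.List;

// Stand-in for the suspected leak: every "open" allocates a buffer
// (playing the role of a reader's in-memory term index) and nothing
// ever releases it, so the heap only grows.
public class LeakSketch {
    static List<byte[]> unclosedReaders = new ArrayList<>();

    static void openIndex() {
        unclosedReaders.add(new byte[1024 * 1024]); // 1 MB per "open"
    }

    public static void main(String[] args) {
        for (int i = 0; i < 10; i++) {
            openIndex(); // never "closed" -> memory accumulates
        }
        System.out.println("Still holding " + unclosedReaders.size()
                + " MB of unreleased buffers");
    }
}
```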

More specifically, if you write a 4 line app that does nothing but open
your index and then close it again, do you get an OOM? ...

import org.apache.lucene.search.IndexSearcher;
import org.apache.lucene.search.Searcher;

public class Main {
    public static void main(String[] args) throws Exception {
        Searcher s = new IndexSearcher("/your/index/path"); // open the index
        s.close();                                          // and release it
    }
}



-Hoss

