Stanislav Jordanov wrote:
> Hi guys,
> Building some huge index (about 500,000 docs totaling 10 megs of plain
> text) we've run into the following problem:
> Most of the time the IndexWriter process consumes a fairly small amount
> of memory (about 32 megs).
> However, as the index size grows, the memory usage sporadically bursts
> to levels of (say) 1000 megs and then falls back to its previous level.
> The problem is that unless the process is started with some option like
> -Xmx1000m, this situation causes an OutOfMemoryError which terminates
> the indexing process.
>
> My question is: is there a way to avoid it?
1. I start my program with:

       java -Xms256M -Xmx512M -jar Suchmaschine.jar &

   This now protects me from an OutOfMemoryError. In addition I use
   iterative subroutines (see point 4).

2. Free your variables as soon as possible, e.g. "term = null;". This
   will help your garbage collector!

3. Maybe you should watch totalMemory() and freeMemory() from
   Runtime.getRuntime(). That will help you find the "memory
   dissipater" (see the sketch after this list).

4. I had the problem when deleting documents from the index. I used a
   subroutine that deleted single documents; it ran much better once I
   replaced it with an "iterative" subroutine like this:

       public int deleteMany(String keywords) {
           int anzahl = 0;                        // number of deleted documents
           try {
               openReader();
               String[] temp = keywords.split(",");
               // Runtime R = Runtime.getRuntime();
               for (int i = 0; i < temp.length; i++) {
                   Term term = new Term("keyword", temp[i]);
                   anzahl += mReader.delete(term); // delete all docs matching this keyword
                   term = null;                    // release the Term for the garbage collector
                   /* System.out.println("deleted " + temp[i]
                          + " t:" + R.totalMemory()
                          + " f:" + R.freeMemory()
                          + " m:" + R.maxMemory()); */
               }
               close();
           } catch (Exception e) {
               cIdowa.error("Could not delete documents: " + keywords
                   + ". Because: " + e.getMessage() + "\n" + e.toString());
           }
           return anzahl;
       }
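Regarding point 3, here is a minimal sketch of how such memory logging
could look around an indexing loop. It is only an illustration under
assumptions: the index path, field name and document count are
placeholders, and it uses the old-style IndexWriter/Field API rather
than any code from the message above.

    import org.apache.lucene.analysis.standard.StandardAnalyzer;
    import org.apache.lucene.document.Document;
    import org.apache.lucene.document.Field;
    import org.apache.lucene.index.IndexWriter;

    public class MemoryLoggingIndexer {
        public static void main(String[] args) throws Exception {
            Runtime rt = Runtime.getRuntime();
            // "/tmp/index" and the field/document values are hypothetical.
            IndexWriter writer = new IndexWriter("/tmp/index",
                                                 new StandardAnalyzer(), true);
            for (int i = 0; i < 100000; i++) {
                Document doc = new Document();
                doc.add(Field.Text("body", "document number " + i));
                writer.addDocument(doc);
                // Log the JVM memory figures every 1000 documents so that
                // sudden bursts (e.g. during segment merges) become visible.
                if (i % 1000 == 0) {
                    System.out.println(i
                        + " total:" + rt.totalMemory()
                        + " free:"  + rt.freeMemory()
                        + " max:"   + rt.maxMemory());
                }
            }
            writer.optimize();
            writer.close();
        }
    }

Watching where totalMemory() jumps during the run should show whether
the bursts coincide with merges or with your own code.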