> Is there a way (perhaps a formula) to accurately judge the memory
> requirement for a Lucene index? (Maybe based on number of documents
> or index size, etc.?)
The short answer is no, although there are some things you can
estimate based on the number of fields, terms, etc. Sorting will use
memory - maybe a lot.

> The reason I am asking is that we had two indexes running on separate
> Tomcat instances and we decided to move both these webapps (Solr) to
> a single Tomcat for more effective memory sharing. However, our JVM
> memory allocation was not generous enough and the indexes started
> hitting OutOfMemory errors in our production environment.
>
> It would be very helpful if we could identify the resource
> requirements proactively.
>
> Any help on the matter is much appreciated.
>
> We use: Solr 1.4, Java 1.6.0_20

You might get better answers on the Solr list.

--
Ian.

---------------------------------------------------------------------
To unsubscribe, e-mail: java-user-unsubscr...@lucene.apache.org
For additional commands, e-mail: java-user-h...@lucene.apache.org
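P.S. To illustrate why sorting memory grows with index size: a rough,
back-of-envelope sketch (not an official Lucene formula - the per-document
and per-term byte costs below are illustrative assumptions, loosely modeled
on the FieldCache behavior in the Lucene 2.9 that Solr 1.4 uses, where
sorting on a field populates an in-memory cache with one entry per document).

```java
// Back-of-envelope estimate of per-sort-field memory.
// ASSUMPTIONS (not measured figures): an int sort field caches roughly
// one 4-byte value per document; a string sort field caches roughly one
// 4-byte ordinal per document plus the unique term bytes themselves
// (Java chars are 2 bytes each). Real overhead is higher.
public class FieldCacheEstimate {

    // Rough bytes for sorting on a numeric (int) field.
    static long intSortBytes(long maxDoc) {
        return maxDoc * 4L; // one int per document
    }

    // Rough bytes for sorting on a string field.
    static long stringSortBytes(long maxDoc, long uniqueTerms, long avgTermChars) {
        return maxDoc * 4L                      // one ordinal per document
             + uniqueTerms * avgTermChars * 2L; // the unique term text itself
    }

    public static void main(String[] args) {
        long maxDoc = 10_000_000L; // hypothetical 10M-doc index
        long mb = 1024L * 1024L;
        System.out.println("int sort field:    ~" + intSortBytes(maxDoc) / mb + " MB");
        System.out.println("string sort field: ~" + stringSortBytes(maxDoc, 1_000_000L, 10L) / mb + " MB");
    }
}
```

Even under these optimistic assumptions, a single sort field on a 10M-doc
index costs tens of megabytes, and every distinct sort field adds its own
cache - which is one reason co-locating two indexes in one JVM without
re-sizing the heap can tip it into OutOfMemory.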