Hi,
    Thanks for your help Erick. I changed my code to flush the writer before adding each document, which helped reduce memory usage. Reducing mergeFactor and maxBufferedDocs also helped me avoid this OOM error (even though the index size is ~1GB).
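For reference, this is roughly what the change looks like -- a minimal sketch against the Lucene 2.3 API; the directory path, field name, and threshold values are made up for illustration, not my real settings:

    import org.apache.lucene.analysis.standard.StandardAnalyzer;
    import org.apache.lucene.document.Document;
    import org.apache.lucene.document.Field;
    import org.apache.lucene.index.IndexWriter;
    import org.apache.lucene.store.FSDirectory;

    public class TunedIndexer {
        public static void main(String[] args) throws Exception {
            // true = create a new index at this (illustrative) path
            IndexWriter writer = new IndexWriter(
                    FSDirectory.getDirectory("/tmp/idx"),
                    new StandardAnalyzer(), true);

            writer.setMergeFactor(5);       // merge fewer segments at once -> lower peak memory
            writer.setMaxBufferedDocs(10);  // flush buffered docs to disk more often
            // Lucene 2.3 can also flush by RAM usage instead of doc count:
            // writer.setRAMBufferSizeMB(16.0);

            for (String text : args) {
                Document doc = new Document();
                doc.add(new Field("content", text,
                        Field.Store.NO, Field.Index.TOKENIZED));
                writer.flush();             // push buffered state out before the next big doc
                writer.addDocument(doc);
            }
            writer.close();
        }
    }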
But please clarify the following doubts:

"Make sure you flush your IndexWriter before attempting to index this document." - Is it good to call writer.flush() before adding every document to the writer? Doesn't it hurt indexing or search performance? Is it effectively the same as setting maxBufferedDocs=1?

Also, please guide me on which of these is relatively better (takes less time & memory):
(i) create 4 indexes of ~250MB each and merge them into a single index using writer.addIndexes(..) (a rough sketch of this option appears at the end of this mail), or
(ii) create one 1GB index and optimize it?

Thanks & Regards
RSK

On Feb 4, 2008 9:23 PM, Erick Erickson <[EMAIL PROTECTED]> wrote:

> ummmm index smaller documents? <G>
>
> You cannot expect to index a 1G doc with 512M of memory in the JVM.
> The first thing I'd try is upping your JVM memory to the max your machine
> will accept.
>
> Make sure you flush your IndexWriter before attempting to index this
> document.
>
> But I would not be surprised if this failed to solve the problem. What's in
> this massive document? Would it be possible to break it up into
> smaller segments and index many sub-documents for this massive doc?
> I also wonder what problem you're trying to solve by indexing this doc.
> Is it a log file? I can't imagine a text document that big. That's like a
> 100 volume encyclopedia, and I can't help but wonder whether your users
> would be better served by indexing it in pieces.
>
> Best
> Erick
>
> On Feb 4, 2008 10:25 AM, SK R <[EMAIL PROTECTED]> wrote:
>
> > Hi,
> >     I got an out of memory exception while indexing huge documents (~1GB)
> > in one thread and optimizing some other (2 to 3) indexes in different
> > threads.
> >     Max JVM heap size is 512MB. I'm using Lucene 2.3.0.
> >
> > Please suggest a way to avoid this exception.
> >
> > Regards
> > RSK
> >
>
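P.S. Here is the rough sketch of option (i) mentioned above -- merging several part-indexes into one with addIndexes(..), again against the Lucene 2.3 API. The directory paths are illustrative only:

    import org.apache.lucene.analysis.standard.StandardAnalyzer;
    import org.apache.lucene.index.IndexWriter;
    import org.apache.lucene.store.Directory;
    import org.apache.lucene.store.FSDirectory;

    public class MergeParts {
        public static void main(String[] args) throws Exception {
            // The four part-indexes built separately (~250MB each)
            Directory[] parts = new Directory[] {
                FSDirectory.getDirectory("/tmp/part1"),
                FSDirectory.getDirectory("/tmp/part2"),
                FSDirectory.getDirectory("/tmp/part3"),
                FSDirectory.getDirectory("/tmp/part4")
            };

            // true = create the merged index fresh at this (illustrative) path
            IndexWriter writer = new IndexWriter(
                    FSDirectory.getDirectory("/tmp/merged"),
                    new StandardAnalyzer(), true);
            writer.addIndexes(parts);  // merges the parts and optimizes the result
            writer.close();
        }
    }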