> From: Michael McCandless [mailto:luc...@mikemccandless.com]
> Sent: Thu 25.06.2009 13:13
> To: java-user@lucene.apache.org
> Subject: Re: OutOfMemoryError using IndexWriter
>
> Can you post your test code? If you can make it a standalone test,
> then I can repro and dig down faster.
OK it looks like no merging was done.
I think the next step is to call
IndexWriter.setMaxBufferedDeleteTerms(1000) and see if that prevents
the OOM.
Mike
On Thu, Jun 25, 2009 at 7:16 AM, stefan wrote:
> Hi,
>
> Here are the results of CheckIndex. I ran this just after I got the OutOfMemoryError.
>
> OK [4
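For reference, CheckIndex ships with Lucene and can be run from the command line, e.g. java -cp lucene-core-2.4.1.jar org.apache.lucene.index.CheckIndex /path/to/index. And a minimal sketch of the setMaxBufferedDeleteTerms suggestion above, against the Lucene 2.4 API (the index path and analyzer choice are placeholders):

  import org.apache.lucene.analysis.standard.StandardAnalyzer;
  import org.apache.lucene.index.IndexWriter;
  import org.apache.lucene.store.Directory;
  import org.apache.lucene.store.FSDirectory;

  public class DeleteTermsConfig {
      public static void main(String[] args) throws Exception {
          Directory dir = FSDirectory.getDirectory("/path/to/index"); // placeholder path
          IndexWriter writer = new IndexWriter(dir, new StandardAnalyzer(),
                  false, IndexWriter.MaxFieldLength.UNLIMITED);
          // Flush buffered delete terms once 1000 have accumulated,
          // instead of holding all of them in RAM until commit/close.
          writer.setMaxBufferedDeleteTerms(1000);
          // ... addDocument()/deleteDocuments() calls as usual ...
          writer.close();
          dir.close();
      }
  }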
time for this.
>
> Stefan
>
> -Original Message-
> From: Michael McCandless [mailto:luc...@mikemccandless.com]
> Sent: Wed 24.06.2009 17:50
> To: java-user@lucene.apache.org
> Subject: Re: OutOfMemoryError using IndexWriter
>
> On Wed, Jun 24, 2009 at 10:1
On Thu, Jun 25, 2009 at 3:02 AM, stefan wrote:
>>But a "leak" would keep leaking over time, right? I.e. even a 1 GB heap
>>on your test db should eventually throw OOME if there's really a leak.
> No, not necessarily, since I stop indexing once everything is indexed - I
> shall try repeated runs wit
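A minimal sketch of such a repeated-run leak check (indexEverything() is a placeholder for the actual indexing job):

  public class LeakCheck {
      // Index the same data several times in one JVM. If used heap keeps
      // growing run over run, that points at a real leak; if it plateaus,
      // indexing simply needs more RAM than expected.
      public static void main(String[] args) throws Exception {
          for (int run = 0; run < 10; run++) {
              indexEverything(); // placeholder for the actual indexing job
              System.gc();
              long used = Runtime.getRuntime().totalMemory()
                        - Runtime.getRuntime().freeMemory();
              System.out.println("run " + run + ": " + used / (1024 * 1024) + " MB used");
          }
      }
      static void indexEverything() { /* open IndexWriter, add all docs, close */ }
  }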
On Wed, Jun 24, 2009 at 10:23 AM, stefan wrote:
> does Lucene keep the complete index in memory?
No.
Certain things (deleted docs, norms, field cache, terms index) are
loaded into memory, but these are tiny compared to what's not loaded
into memory (postings, stored docs, term vectors).
> As s
On Wed, Jun 24, 2009 at 10:18 AM, stefan wrote:
>
> Hi,
>
>
>>OK so this means it's not a leak, and instead it's just that stuff is
>>consuming more RAM than expected.
> Or that my test db is smaller than the production db, which is indeed the case.
But a "leak" would keep leaking over time, right?
From: Sudarsan, Sithu D. [mailto:sithu.sudar...@fda.hhs.gov]
Sent: Wed 24.06.2009 16:18
To: java-user@lucene.apache.org
Subject: RE: OutOfMemoryError using IndexWriter
When the segments are merged, but not optimized. It happened to our
program at 1.8GB, and now we develop and test in Win32 but run the
Hi Stefan,
Are you using Windows 32 bit? If so, sometimes, if the index file size before
optimization exceeds your JVM heap setting (say 512MB),
there is a possibility of this happening.
Increase the JVM memory settings if that is the case.
Sincerely,
Sithu D Sudarsan
Off: 301-796-2587
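For example (Indexer is a hypothetical main class), the heap ceiling is raised with the -Xmx flag when launching the JVM:

  java -Xmx1024m -cp "lucene-core-2.4.1.jar;." Indexer

Note that a 32-bit Windows JVM typically cannot go much beyond roughly 1.4-1.6GB of heap, regardless of physical RAM.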
On Wed, Jun 24, 2009 at 7:43 AM, stefan wrote:
> I tried with a 100MB heap size and got the error as well; it runs fine with
> 120MB.
OK so this means it's not a leak, and instead it's just that stuff is
consuming more RAM than expected.
> Here is the histogram (application classes marked with --
Hi Stefan,
While not directly the source of your problem, I have a feeling you are
optimizing too frequently (and wasting time/CPU by doing so). Is there a
reason you optimize so often? Try optimizing only at the end, when you know
you won't be adding any more documents to the index for a whi
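A sketch of that pattern in Lucene 2.4 terms (path and field layout are placeholders): add everything first, then optimize exactly once.

  import org.apache.lucene.analysis.standard.StandardAnalyzer;
  import org.apache.lucene.document.Document;
  import org.apache.lucene.document.Field;
  import org.apache.lucene.index.IndexWriter;
  import org.apache.lucene.store.FSDirectory;

  public class IndexThenOptimize {
      public static void main(String[] args) throws Exception {
          IndexWriter writer = new IndexWriter(
                  FSDirectory.getDirectory("/path/to/index"), // placeholder path
                  new StandardAnalyzer(), true,
                  IndexWriter.MaxFieldLength.UNLIMITED);
          for (int i = 0; i < 1000000; i++) {
              Document doc = new Document();
              doc.add(new Field("id", Integer.toString(i),
                      Field.Store.YES, Field.Index.NOT_ANALYZED));
              writer.addDocument(doc);
              // no optimize() inside the loop
          }
          writer.optimize(); // one optimize, only after all adds are done
          writer.close();
      }
  }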
) 3268608 (size)
>
> Well, is there something I should do differently?
>
> Stefan
>
> -Original Message-
> From: Michael McCandless [mailto:luc...@mikemccandless.com]
> Sent: Wed 24.06.2009 10:48
> To: java-user@lucene.apache.org
> Subject: Re: OutOfMemory
How large is the RAM buffer that you're giving IndexWriter? How large
a heap size do you give the JVM?
Can you post one of the OOM exceptions you're hitting?
Mike
On Wed, Jun 24, 2009 at 4:08 AM, stefan wrote:
> Hi,
>
> I am using Lucene 2.4.1 to index a database with less than a million records
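For reference, a small sketch that reports both numbers Mike is asking about (the index path is a placeholder): setRAMBufferSizeMB/getRAMBufferSizeMB control the IndexWriter RAM buffer (16 MB by default), and Runtime.maxMemory() reports the JVM heap ceiling.

  import org.apache.lucene.analysis.standard.StandardAnalyzer;
  import org.apache.lucene.index.IndexWriter;
  import org.apache.lucene.store.FSDirectory;

  public class MemorySettings {
      public static void main(String[] args) throws Exception {
          System.out.println("JVM max heap: "
                  + Runtime.getRuntime().maxMemory() / (1024 * 1024) + " MB");
          IndexWriter writer = new IndexWriter(
                  FSDirectory.getDirectory("/path/to/index"), // placeholder path
                  new StandardAnalyzer(), true,
                  IndexWriter.MaxFieldLength.UNLIMITED);
          // The writer flushes its in-RAM buffer of added/deleted docs when
          // it reaches this size; the default is 16 MB.
          System.out.println("RAM buffer: " + writer.getRAMBufferSizeMB() + " MB");
          writer.close();
      }
  }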