- Original Message -
> From: Alan Woodward [mailto:a...@flax.co.uk]
> Sent: Montag, 8. Juni 2015 12:23
> To: java-user@lucene.apache.org
> Subject: Re: Memory problem with TermQuery
>
> Hi Anna,
>
> In normal usage, perReaderTermState will be null, and TermQuery will be very
> lightweight. It's in particular expert use cases (generally after queries have
> been rewritten against a specific IndexReader) that the perReaderTermState will
> be initialized. Are you caching rewritten queries?
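Alan's point can be illustrated with a plain-Java model (the `Query` class, its `perReaderState` field, and the cache here are hypothetical stand-ins, not Lucene API): caching a query *after* it has been rewritten against a reader pins the heavy per-reader state in memory, while caching the original query keeps only a lightweight object.

```java
import java.util.HashMap;
import java.util.Map;

public class QueryCacheSketch {
    // Hypothetical stand-in for TermQuery's perReaderTermState:
    // null in normal use, populated only after rewriting against a reader.
    static class Query {
        final String term;
        long[] perReaderState;              // heavy, reader-specific data
        Query(String term) { this.term = term; }
        Query rewrite() {                   // models rewriting against an IndexReader
            Query q = new Query(term);
            q.perReaderState = new long[1 << 16];  // ~512 KB pinned per cached entry
            return q;
        }
    }

    public static void main(String[] args) {
        Map<String, Query> cache = new HashMap<>();

        // Anti-pattern: cache the rewritten query -> per-reader state is retained
        cache.put("title:foo", new Query("foo").rewrite());

        // Better: cache the original query; rewrite per search and discard
        cache.put("title:bar", new Query("bar"));

        System.out.println(cache.get("title:foo").perReaderState != null);
        System.out.println(cache.get("title:bar").perReaderState == null);
    }
}
```

Both lines print `true`: the rewritten entry keeps its per-reader array alive for the lifetime of the cache, the original entry does not.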
Are you opening/closing your searcher and writer for each document?
If so, it sounds like you're not closing all of them appropriately and
that would be the cause of your memory increase. But you shouldn't
have to do that anyway. Why not just use the same IndexReader to
search and delete all your documents?
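The leak pattern described above can be sketched in plain Java (the `Searcher` class is a hypothetical stand-in for Lucene's IndexSearcher, and `openCount` models held resources): opening a searcher per document without closing it grows steadily, while one shared instance does not.

```java
public class SearcherReuseSketch {
    static int openCount = 0;          // stands in for heap/file-handle resources

    static class Searcher {            // hypothetical, models an IndexSearcher
        Searcher() { openCount++; }
        void close() { openCount--; }
        void search(String q) { /* run the query ... */ }
    }

    public static void main(String[] args) {
        // Anti-pattern: open per document, forget to close -> steady growth
        for (int i = 0; i < 1000; i++) {
            Searcher s = new Searcher();
            s.search("doc" + i);
            // s.close() forgotten here
        }
        System.out.println("leaked: " + openCount);

        openCount = 0;
        // Fix: one shared searcher for the whole batch, closed once at the end
        Searcher shared = new Searcher();
        for (int i = 0; i < 1000; i++) shared.search("doc" + i);
        shared.close();
        System.out.println("leaked: " + openCount);
    }
}
```

The first loop reports `leaked: 1000`; the shared-instance version reports `leaked: 0`.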
…instance as soon as all clients that were using it are done.
- Original Message -
From: "Chris Hostetter" <[EMAIL PROTECTED]>
To:
Sent: Wednesday, February 01, 2006 6:03 PM
Subject: Re: Memory problem
it seems like there are a few common things that bite people over and over
again that you should check first and foremost...
1) Don't use more searchers/readers than you need.
Every time you open an IndexSearcher/IndexReader, resources are used which
take up memory. For an application pointed a…
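Point 1 is usually handled by sharing a single searcher across all callers. A minimal sketch of lazily creating one shared instance (the `Searcher` class is a hypothetical stand-in for IndexSearcher, not Lucene code; the double-checked locking is one common way to do this safely across threads):

```java
public class SearcherHolder {
    // Hypothetical searcher; in Lucene this would be an IndexSearcher
    // built over an IndexReader opened once for the whole application.
    static class Searcher {
        final long openedAt = System.nanoTime();
    }

    private static volatile Searcher current;

    // All request threads call get(); only the first call opens a searcher.
    static Searcher get() {
        Searcher s = current;
        if (s == null) {
            synchronized (SearcherHolder.class) {
                if (current == null) current = new Searcher();
                s = current;
            }
        }
        return s;
    }

    public static void main(String[] args) {
        // Every caller observes the same instance, so only one set of
        // reader resources is ever held in memory.
        System.out.println(get() == get());
    }
}
```

Prints `true`: repeated lookups return the one shared searcher instead of opening new ones.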
As long as you have many documents in the index, there can be many unique
terms in the index.
Every 128th term (by default) is written to the term info index for faster
term lookup.
This info is loaded entirely into memory when searching, so it can
increase memory usage.
Note that this does not depend on the number o…
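The memory cost described above can be estimated with simple arithmetic. With the default interval of 128, roughly one of every 128 unique terms is held in the in-memory term index; the per-entry byte size below is a rough assumption for illustration, not an exact accounting of Lucene's data structures.

```java
public class TermIndexEstimate {
    public static void main(String[] args) {
        long uniqueTerms = 10_000_000L;   // example: 10M unique terms in the index
        int interval = 128;               // default term index interval

        // One in-memory entry per `interval` terms on disk.
        long inMemoryEntries = uniqueTerms / interval;

        // Assume ~64 bytes per entry (term text + file offsets); a rough guess.
        long approxBytes = inMemoryEntries * 64;

        System.out.println(inMemoryEntries);
        System.out.println(approxBytes / (1024 * 1024) + " MB");
    }
}
```

For 10M unique terms this yields 78125 in-memory entries, on the order of 4 MB; the point of the thread is that this scales with unique terms, so very large term dictionaries cost real heap at search time.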
Hi Nick,
we didn't get the error on importing; it was actually when conducting a
search. Would this still help?
Thanks
Leon
- Original Message -
From: "Nick Vincent" <[EMAIL PROTECTED]>
To:
Sent: Wednesday, February 01, 2006 11:17 AM
Subject: RE: Memory problem
Hi Leon,
I had a similar problem when doing a test import which I believe was actually
down to object churn in parsing the data to create the Documents. I achieved a
quick fix by calling System.gc() every thousand documents.
Cheers,
Nick
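Nick's quick fix above can be sketched as a periodic collection hint inside the import loop. Note that `System.gc()` is only a *request* to the JVM, and this is a workaround for object churn during Document parsing rather than a real fix; the loop body and document count here are illustrative.

```java
public class ImportWithGcHint {
    public static void main(String[] args) {
        int gcHints = 0;
        int totalDocs = 5000;             // example import size
        for (int i = 1; i <= totalDocs; i++) {
            // ... parse the source data, build a Document, add it to the index ...
            if (i % 1000 == 0) {          // every thousand documents, as Nick did
                System.gc();              // hint the JVM to collect churned objects
                gcHints++;
            }
        }
        System.out.println("gc hints: " + gcHints);
    }
}
```

Over 5000 documents this issues 5 hints (`gc hints: 5`); tuning the batch size trades pause frequency against peak heap during the import.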
From: Leon Chaddo