On Fri, 2006-09-29 at 11:50 +0200, karl wettin wrote:
> I don't consider a 300M index to be fairly small.
Oops. I /do/ think it is.

On Thu, 2006-09-28 at 10:05 +0100, Rob Young wrote:
>
> > total file system size of the index?
> segments    31b
> deletable    4b
> index     286Mb
If you experience that a 300M index is much slower than a 30M one, then
something is probably rotten. I don't consider a 300M index to be fairly
small.

On Wednesday 27 September 2006 18:51, Erik Hatcher wrote:
> Lots of possible issues, but we need more information to troubleshoot
> this properly.
> How big is your index, number of documents?
CDs    137,390
DVDs    41,049
Games    3,360
Books  648,941
Total  830,740
> total file system size of the index?

I'd ask for more details. You say that you've narrowed it down to Lucene
doing the searching, but which part of the search? Here are two places
people have run into problems before (sorry if you already know this...):
1> Iterating through the entire returned set with Hits.doc(#).
2> Opening a new IndexSearcher for every query instead of reusing one
   (see the sketch below).
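To make (1) concrete, here is a minimal sketch against the 1.9/2.0-era
Hits API; the index path, field name, and query string are placeholder
assumptions, not anything from the original poster's setup:

import org.apache.lucene.analysis.standard.StandardAnalyzer;
import org.apache.lucene.queryParser.QueryParser;
import org.apache.lucene.search.Hits;
import org.apache.lucene.search.IndexSearcher;
import org.apache.lucene.search.Query;

public class PagedSearch {
    public static void main(String[] args) throws Exception {
        // "/path/to/index" and the "title" field are illustrative only.
        IndexSearcher searcher = new IndexSearcher("/path/to/index");
        Query query =
            new QueryParser("title", new StandardAnalyzer()).parse("lucene");
        Hits hits = searcher.search(query);

        // Anti-pattern: for (int i = 0; i < hits.length(); i++) hits.doc(i);
        // That loads every stored document, even ones never displayed.
        // Touch only the page you actually show:
        int pageSize = Math.min(10, hits.length());
        for (int i = 0; i < pageSize; i++) {
            System.out.println(hits.doc(i).get("title")
                + " (score=" + hits.score(i) + ")");
        }
        searcher.close();
    }
}
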
Lots of possible issues, but we need more information to troubleshoot
this properly. How big is your index, number of documents? total
file system size of the index? is your index optimized? how often
do you update the index? how are you managing indexsearcher
instances after the index is updated?
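On that last point, the usual pattern is to share one IndexSearcher
across all requests and swap it only when the index changes. A minimal
sketch, assuming a single index at a fixed path (the class name and path
are made up for illustration):

import java.io.IOException;
import org.apache.lucene.search.IndexSearcher;

public class SearcherHolder {
    private static final String INDEX_PATH = "/path/to/index";
    private static IndexSearcher searcher;

    // Reuse one IndexSearcher for every query; opening a new one per
    // request rereads the index structures and kills throughput.
    public static synchronized IndexSearcher get() throws IOException {
        if (searcher == null) {
            searcher = new IndexSearcher(INDEX_PATH);
        }
        return searcher;
    }

    // Call after the index has been updated so new documents become visible.
    public static synchronized void reopen() throws IOException {
        IndexSearcher old = searcher;
        searcher = new IndexSearcher(INDEX_PATH);
        // In production, close the old searcher only once in-flight
        // searches against it have finished.
        if (old != null) {
            old.close();
        }
    }
}

Query code then calls SearcherHolder.get() instead of constructing its
own searcher, and the indexing side calls reopen() after committing.
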
Hi,
I'm using Lucene to search a product database (CDs, DVDs, games and now
books). Recently that index has increased in size to over a million items
(added books). I have been performance testing our search server and the
throughput of requests has dropped significantly; profiling the server, I
have narrowed it down to Lucene doing the searching.