Problem with updating Index continuously

2008-10-19 Thread Cool The Breezer
Hi, I have a requirement to update the search index continuously, and it results in the creation of lots of index files; the index size also keeps increasing. I create the index writer with autocommit true and create false: directory = FSDirectory.getDirectory(indexDir); docWriter = new IndexWriter(directory, true,
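For reference, the writer setup described above might look like the sketch below under the Lucene 2.x API. The index path, analyzer choice, and class name are assumptions for illustration, not from the original mail.

```java
import org.apache.lucene.analysis.standard.StandardAnalyzer;
import org.apache.lucene.index.IndexWriter;
import org.apache.lucene.store.Directory;
import org.apache.lucene.store.FSDirectory;

public class WriterSetup {
    public static void main(String[] args) throws Exception {
        String indexDir = "/path/to/index"; // hypothetical path
        Directory directory = FSDirectory.getDirectory(indexDir);
        // autoCommit=true, create=false: open the existing index for updates
        IndexWriter docWriter = new IndexWriter(directory, true,
                new StandardAnalyzer(), false);
        docWriter.setUseCompoundFile(true); // each segment becomes one .cfs file
        // ... addDocument()/updateDocument() calls go here ...
        docWriter.close(); // flush pending changes and release the write lock
    }
}
```

Note that every flush writes a new segment, so a growing file count between merges is normal; the count only shrinks when Lucene merges segments.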

Re: Problem with updating Index continuously

2008-10-20 Thread Cool The Breezer
> You need to close the old reader, only if the newReader is different (i.e., it was in fact reopened because there were changes in the index). I tried closing it but am getting an "index already closed" error. IndexReader newReader = reader.reopen(); if (newReader != reader)
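The "index already closed" error usually means the wrong reader was closed (or the old one was closed twice). A minimal sketch of the pattern the quoted advice describes, assuming a long-lived `reader` field:

```java
// assumes a long-lived field: private IndexReader reader;
IndexReader newReader = reader.reopen();
if (newReader != reader) {
    // reopen() returned a fresh instance, so the OLD reader is the one
    // to close; closing newReader (or closing reader twice) produces
    // "index already closed" on the next use
    reader.close();
    reader = newReader;
}
```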

Order the index by timestamp field and Get n documents

2008-11-09 Thread Cool The Breezer
Hi, In my index there is a field called timestamp, which is the long value of a date. I am trying to sort all documents by timestamp and get N documents. I am trying to find a way to create a query like "timestamp > 0", order the result by timestamp, and get N documents. However I am not a

Re: Order the index by timestamp field and Get n documents

2008-11-10 Thread Cool The Breezer
;timestamp",true); Filter dupFilter = new DuplicateFilter("id"); Hits hits = searcher.search(rangeQuery,dupFilter,sort); --- On Mon, 11/10/08, Cool The Breezer <[EMAIL PROTECTED]> wrote: > From: Cool The Breezer <[EMAIL PROTECTED]> > Subject: Order the in
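Pieced together, the truncated reply above might look like the sketch below (Lucene 2.x API). The field names "timestamp" and "id" come from the thread; using MatchAllDocsQuery in place of the truncated range query is an assumption (for an epoch-millis timestamp it matches the "timestamp > 0" intent), and DuplicateFilter ships in the contrib-queries jar, not lucene-core. This also assumes an existing `searcher`, and uses TopFieldDocs instead of the deprecated Hits class.

```java
import org.apache.lucene.search.DuplicateFilter; // contrib-queries jar in 2.x
import org.apache.lucene.search.MatchAllDocsQuery;
import org.apache.lucene.search.Query;
import org.apache.lucene.search.ScoreDoc;
import org.apache.lucene.search.Sort;
import org.apache.lucene.search.SortField;
import org.apache.lucene.search.TopFieldDocs;

// newest first: reverse=true on the long-valued "timestamp" field
Sort sort = new Sort(new SortField("timestamp", SortField.LONG, true));
DuplicateFilter dupFilter = new DuplicateFilter("id"); // keep one doc per id
Query all = new MatchAllDocsQuery();
int n = 10; // take the top N
TopFieldDocs top = searcher.search(all, dupFilter, n, sort);
for (ScoreDoc sd : top.scoreDocs) {
    // searcher.doc(sd.doc) loads the stored fields of each hit
}
```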

Re: Reopen IndexReader

2008-11-18 Thread Cool The Breezer
I had the same kind of problem, and I somehow managed to find a workaround by initializing the IndexSearcher from the new reader. try { IndexReader newReader = reader.reopen(); if (newReader != reader) { // reader was reopened
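The workaround described here, completed into a sketch (the `reader` and `searcher` fields are assumed; note that in Lucene 2.x closing an IndexSearcher that was constructed from an IndexReader does not close that reader):

```java
// assumes fields: private IndexReader reader; private IndexSearcher searcher;
IndexReader newReader = reader.reopen();
if (newReader != reader) { // reader was reopened
    IndexSearcher oldSearcher = searcher;
    searcher = new IndexSearcher(newReader); // searcher over the fresh reader
    oldSearcher.close(); // does NOT close the underlying reader...
    reader.close();      // ...so close the old reader explicitly
    reader = newReader;
}
```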

Re: [ot] a reverse lucene

2008-11-23 Thread Cool The Breezer
Maybe an RSS feed is a solution. Just provide an RSS feed as the search result for each query; people subscribing to these RSS feeds would get notifications at regular intervals. They need to install RSS clients, which can run the queries at regular intervals. --- On Sun, 11/23/08, Ian Holsman <[EMAIL PROTE

Re: Proximity Search between phrases

2008-12-29 Thread Cool The Breezer
You could use phrase queries too, like "Economic Meltdown" AND "Asian Countries", but these phrases may be too distant from one another to be relevant for your search purposes. To get a better result with respect to position (the distance between the phrases), you can use SpanNearQuery. Let me know if you need mo
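A SpanNearQuery for two phrases might be built roughly like this (the field name "body" and the slop of 10 are assumptions for illustration; each phrase is itself a zero-slop, in-order span):

```java
import org.apache.lucene.index.Term;
import org.apache.lucene.search.spans.SpanNearQuery;
import org.apache.lucene.search.spans.SpanQuery;
import org.apache.lucene.search.spans.SpanTermQuery;

// each phrase = exact-order span with slop 0
SpanQuery phrase1 = new SpanNearQuery(new SpanQuery[] {
        new SpanTermQuery(new Term("body", "economic")),
        new SpanTermQuery(new Term("body", "meltdown")) }, 0, true);
SpanQuery phrase2 = new SpanNearQuery(new SpanQuery[] {
        new SpanTermQuery(new Term("body", "asian")),
        new SpanTermQuery(new Term("body", "countries")) }, 0, true);

// require the two phrases within 10 positions of each other, any order
SpanQuery query = new SpanNearQuery(
        new SpanQuery[] { phrase1, phrase2 }, 10, false);
```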

Similarity

2009-06-23 Thread Cool The Breezer
Of late I started using Lucene as the main search library for all documents in our intranet. It works extremely well. I am trying to use similarity-type functionality to find the similarity between two sentences/documents, and trying to use WordNet in our search solution. I have used wordnet con

Re: Similarity

2009-06-23 Thread Cool The Breezer
ce if you have a domain specific corpus, you would need to generate some kind of Latent Semantic Index to extract the relations therein. On Tue, Jun 23, 2009 at 5:27 AM, Cool The Breezer wrote: > > Of the late I started using Lucene as main search library for all documents > in our intra

IndexWriter creates multiple .cfs files

2009-12-07 Thread Cool The Breezer
Hello Group, I am continuously updating an index while a searcher also searches it, which has resulted in multiple .cfs files, one for each commit by the IndexWriter. I am not sure whether this is expected behavior or whether I need to merge after each IndexWriter commit
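One new .cfs per commit is expected behavior: each commit flushes buffered documents into a new segment, and merging only happens as segments accumulate. A hedged sketch (assumes an open `docWriter` and a `doc`, and Lucene 2.4+, where `commit()` exists):

```java
// each commit() flushes pending docs into a NEW segment (a new .cfs file);
// Lucene merges segments later according to the merge policy, so the
// file count growing between merges is normal
docWriter.addDocument(doc);
docWriter.commit();

// to force everything down to a single segment (expensive; do it rarely):
docWriter.optimize();
```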

Re: IndexWriter creates multiple .cfs files

2009-12-07 Thread Cool The Breezer
you're trying to accomplish (i.e. the why). Jason On Mon, Dec 7, 2009 at 10:25 PM, Cool The Breezer wrote: > Hello Group, > I am continuously updating an index and at the same time > searcher also searches the index, which resulted in multiple .cfs files for > each

Re: IndexWriter creates multiple .cfs files

2009-12-08 Thread Cool The Breezer
merging the CFSs down, over time. Have you changed your mergeFactor? It's odd to see 100s of CFSs. Or maybe you're not closing the old reader on reopening a new one? That would prevent deletion of the files. Mike On Tue, Dec 8, 2009 at 1:43 AM, Cool The Breezer wrote: > Thanks Ja
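The two things this reply points at could be checked roughly as follows (sketch; assumes an open `writer`, and 10 is the documented default mergeFactor):

```java
// a lower mergeFactor merges segments more aggressively (10 is the default);
// a very high value would let hundreds of segments pile up
writer.setMergeFactor(10);
writer.setUseCompoundFile(true);

// hundreds of .cfs files usually mean obsolete segments cannot be deleted
// because an old IndexReader still holds them open: always close the old
// reader after a successful reopen() so Lucene's file deleter can remove
// the files of superseded commits
```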