doc.add(new Field("contents", indexForm.getContent(),
        Field.Store.YES, Field.Index.ANALYZED));
writer.updateDocument(new Term("id"), doc);
but I've had no luck; the index keeps getting duplicate documents. How do
I handle it? Please can anyone help me?
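A likely cause, sketched below under an assumption the thread never confirms: new Term("id") carries an empty value, so updateDocument matches nothing, deletes nothing, and every call just adds another copy. Passing the record's actual id (the uniqueId parameter here is hypothetical) lets the writer replace the old document:

    import org.apache.lucene.document.Document;
    import org.apache.lucene.document.Field;
    import org.apache.lucene.index.IndexWriter;
    import org.apache.lucene.index.Term;

    void update(IndexWriter writer, String uniqueId, String content) throws java.io.IOException {
        Document doc = new Document();
        // The id field must be indexed un-analyzed so the Term can match it exactly.
        doc.add(new Field("id", uniqueId, Field.Store.YES, Field.Index.NOT_ANALYZED));
        doc.add(new Field("contents", content, Field.Store.YES, Field.Index.ANALYZED));
        // Deletes any existing document whose "id" term equals uniqueId, then adds doc.
        writer.updateDocument(new Term("id", uniqueId), doc);
    }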
MySQL supports full-text search; why still stick with Nutch?
2008/10/20 Michael McCandless <[EMAIL PROTECTED]>
>
> Is it possible you are closing it somewhere else?
>
> This code fragment looks correct to me.
>
> Mike
>
> Cool The Breezer wrote:
>
> You need to close the old reader, only if the newReader is
> different (ie, it was in fact reopened because there were changes
> in the index).
I tried closing but getting "index already closed" error.
IndexReader newReader = reader.reopen();
if (newReader != reader) reader.close();  // only close when reopen returned a new reader
reader = newReader;
You need to close the old reader, only if the newReader is different
(ie, it was in fact reopened because there were changes in the index).
Not closing the old reader will cause the files it held open to be
undeletable.
Mike
Cool The Breezer wrote:
Hi,
I have a requirement to update a search index, and it results in the creation
of lots of index files; the index size also keeps increasing. I create the
index writer with autocommit true and create false:
directory = FSDirectory.getDirectory(indexDir);
docWriter = new IndexWriter(directory, true, analyzer, false);
BTW Erick this works brilliantly with UN_TOKENIZED. SUPER fast :)
On 2/25/07, Erick Erickson <[EMAIL PROTECTED]> wrote:
Yes, I'm pretty sure you have to index the field (UN_TOKENIZED) to be able
to fetch it with TermDocs/TermEnum! The loop I posted works like this
for each term in the index for the field
Daniel Noll <[EMAIL PROTECTED]> wrote on 01/03/2007 22:10:15:
> > API IndexWriter.updateDocument() may be useful.
>
> Whoa, nice convenience method.
>
> I don't suppose the new document happens to be given the same ID as the
> old one. That would make many people's lives much easier. :-)
Oh no,
Doron Cohen wrote:
Once the database_id field is indexed this way, the newly added
API IndexWriter.updateDocument() may be useful.
Whoa, nice convenience method.
I don't suppose the new document happens to be given the same ID as the
old one. That would make many people's lives much easier. :-)
Yes, correct. I'll be using the new updateDocument() API call!
Erick, thanks for correcting my poor use of TermDocs :)
On 2/27/07, Doron Cohen <[EMAIL PROTECTED]> wrote:
"Erick Erickson" <[EMAIL PROTECTED]> wrote on 25/02/2007 07:05:21:
> Yes, I'm pretty sure you have to index the field (UN_TOK
"Erick Erickson" <[EMAIL PROTECTED]> wrote on 25/02/2007 07:05:21:
> Yes, I'm pretty sure you have to index the field (UN_TOKENIZED) to be able
> to fetch it with TermDocs/TermEnum! The loop I posted works like this
Once the database_id field is indexed this way, the newly added
API IndexWriter.updateDocument() may be useful.
Yes, I'm pretty sure you have to index the field (UN_TOKENIZED) to be able
to fetch it with TermDocs/TermEnum! The loop I posted works like this
    for each term in the index for the field
        if this is one I want to update
            use a TermDocs to get to that document and operate on it.
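That loop in Java would look something like this - a sketch under assumptions the thread doesn't spell out (Lucene 2.x-era API; the "id" field name and the idsToUpdate set are hypothetical):

    import java.io.IOException;
    import java.util.Set;
    import org.apache.lucene.index.IndexReader;
    import org.apache.lucene.index.Term;
    import org.apache.lucene.index.TermDocs;
    import org.apache.lucene.index.TermEnum;

    void deleteStale(IndexReader reader, Set idsToUpdate) throws IOException {
        TermEnum terms = reader.terms(new Term("id", ""));  // seek to the field's first term
        try {
            do {
                Term t = terms.term();
                if (t == null || !"id".equals(t.field())) break;  // walked past the id field
                if (idsToUpdate.contains(t.text())) {             // one I want to update
                    TermDocs tDocs = reader.termDocs(t);
                    while (tDocs.next())
                        reader.deleteDocument(tDocs.doc());       // operate on it: drop the old copy
                    tDocs.close();
                }
            } while (terms.next());
        } finally {
            terms.close();
        }
    }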
But I didn't fully understand your last post and why I wanted to do
IndexReader.terms() then IndexReader.termDocs(). Won't something like this
work?
for (Business biz : updates)
{
    Term t = new Term("id", biz.getId() + "");
    TermDocs tDocs = reader.termDocs(t);
    while (tDocs.next())
        reader.deleteDocument(tDocs.doc());  // delete the stale copy before re-adding
    tDocs.close();
}
I think you can get MUCH better efficiency by using TermEnum/TermDocs. But I
think you need to index (UN_TOKENIZED) your primary key (although now I'm
not sure). I'd be surprised if TermEnum worked with un-indexed data.
Still, it'd be worth trying, but I've always assumed that TermEnums only
work on indexed fields.
I have an index where I'm storing the primary key of my database record as
an unindexed field. Nightly I want to update my search index with any
database changes / additions.
I don't really see an efficient way to update these records besides doing
something like this, which I'm worried about:
On 12/22/06, Mark Miller <[EMAIL PROTECTED]> wrote:
So the first time you do a sort, the FieldCache is
loaded up, storing the term to sort on for each document id.
Right.
The actual sorting appears to happen just like with relevancy-score
sorting, using a priority queue that is loaded as
than all of the others and the only difference is the sheer number of
items in the index.
-----Original Message-----
From: Mark Miller [mailto:[EMAIL PROTECTED]
Sent: Thursday, December 21, 2006 2:48 PM
To: java-user@lucene.apache.org
Subject: Re: First search is slow after updating index .. subsequent searches very fast
Since you say you are sorting on a field, the bulk of the time will be
doing the sort and caching it (FieldCache). Subsequent searches use that
cache to avoid paying the full sort cost.
To populate FieldCache, the number of matches doesn't matter. There is
no need to scrimp there - you don't really save anything by running a
query that matches only a few docs. Just run something that looks like
a common query.
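A warm-up along those lines might look like this - an illustrative sketch, not Otis's code, assuming a 2.x-era API, an already-opened reader, and a hypothetical sort field "price":

    import org.apache.lucene.search.Hits;
    import org.apache.lucene.search.IndexSearcher;
    import org.apache.lucene.search.MatchAllDocsQuery;
    import org.apache.lucene.search.Sort;

    // Run one sorted query right after reopening the reader; this forces
    // the FieldCache for the sort field to load before real users arrive.
    IndexSearcher searcher = new IndexSearcher(reader);
    Hits hits = searcher.search(new MatchAllDocsQuery(), new Sort("price"));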
> Something like dd if=/path/to/index/foo.cfs of=/dev/null
Be careful not to make a mistake with the 'of' argument of 'dd' - see
http://en.wikipedia.org/wiki/Dd_(Unix)
Hi,
On Thu, 2006-12-21 at 10:21 -0800, Otis Gospodnetic wrote:
> Something like dd if=/path/to/index/foo.cfs of=/dev/null
> Basically, force the data through the kernel preemptively, so FS caches it.
> Run vmstat while doing it, and if the index hasn't been cached by the FS,
> you should see a spike in disk reads.
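A Java analogue of the dd trick, as a sketch (a hypothetical helper, not code from the list): read every index file once so the OS caches it.

    import java.io.File;
    import java.io.FileInputStream;
    import java.io.IOException;
    import java.io.InputStream;

    // Pull every index file through the OS page cache so the first real
    // search after an update doesn't pay the disk I/O cost.
    static void warmFileCache(File indexDir) throws IOException {
        byte[] buf = new byte[1 << 20];
        for (File f : indexDir.listFiles()) {
            InputStream in = new FileInputStream(f);
            try {
                while (in.read(buf) != -1) { /* discard; caching is the point */ }
            } finally {
                in.close();
            }
        }
    }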
----- Original Message -----
From: Bogdan Ghidireac <[EMAIL PROTECTED]>
To: java-user@lucene.apache.org
Sent: Thursday, December 21, 2006 3:58:52 AM
Subject: Re: First search is slow after updating index .. subsequent searches very fast
Otis,
I am not familiar with the 'dd trick' to warm up the index. Can you please
explain it?
Bogdan
On 12/20/06, Otis Gospodnetic <[EMAIL PROTECTED]> wrote:
To populate FieldCache, the number of matches doesn't matter. There is no
need to scrimp there - you don't really save anything by
`dd' trick under UNIX.
Otis
----- Original Message -----
From: Bryan Dotzour <[EMAIL PROTECTED]>
To: java-user@lucene.apache.org
Sent: Wednesday, December 20, 2006 5:23:40 PM
Subject: RE: First search is slow after updating index .. subsequent searches
very fast
One question about this,
Thanks again,
Bryan
-----Original Message-----
From: Otis Gospodnetic [mailto:[EMAIL PROTECTED]
Sent: Wednesday, December 20, 2006 3:28 PM
To: java-user@lucene.apache.org
Subject: Re: First search is slow after updating index .. subsequent
searches very fast
All sounds good. Opening a new IndexReader can take a bit of time. If
you use sorting of any kind other than default sorting by relevance,
this delay on the first search is also probably caused by the lazy
FieldCache population.
I'm investigating some performance issues with the way we're using
Lucene in our web app and am interested if anyone could shed some light
on what might be going on. Hopefully I can provide enough information,
please let me know if there's more I can give.
We're using Lucene 2.0.0 and I'm curr
If I try to add documents to an index while a reader is open I get an error
message saying "Cannot delete C:\myindex\_3n.f0". I suspect that this is due to
the fact that the Windows filesystem won't allow deletion of a file while there
is a file handle connected to it. The solution I have at the moment i
My approach, which I think is common, is to use the Quartz scheduler.
Chris
-
Instant Lucene Search on Any Databases/Applications
http://www.dbsight.net
On 6/12/06, Van Nguyen <[EMAIL PROTECTED]> wrote:
I've been playing around with Lucene for a while now. I'm pretty
comfortable with creating an index and searching against it. Up until
now, I've been using the LuceneIndexAccessor package contributed by Maik
Schreiber and that's working well for me.
Now the next obstacle is to figure out wh
Thanks Stephen,
This was really helpful.
Cheers,
--Kiran
-----Original Message-----
From: Stephen Gray [mailto:[EMAIL PROTECTED]
Sent: Thursday, May 04, 2006 4:11 AM
To: java-user@lucene.apache.org
Subject: Re: Updating index if there is a database changes
Hi Kiran,
Once you've updated an index using IndexWriter or IndexReader you just need
to close and re-open your IndexSearcher so that searching includes the
changes. There is a great library called LuceneIndexAccessor at the link
below that manages this for you. It creates an IndexReader/Writer
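The close-and-reopen step itself is small; a minimal sketch (hypothetical variable names, Lucene 1.9/2.0-era API), assuming the writer has just finished its updates:

    import org.apache.lucene.search.IndexSearcher;

    writer.close();                          // flush and commit the updates
    searcher.close();                        // release the stale searcher
    searcher = new IndexSearcher(indexDir);  // re-open; new searches see the changes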
My approach is to select documents ordered by updated_date desc,
and only process documents newer than the ones already in the index.
Chris Lu
Full-Text Lucene Search for Any Databases/Applications
http://www.dbsight.net
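A sketch of that incremental pass (a hypothetical schema with a documents table and an updated_date column; lastIndexed is the newest timestamp already in the index):

    import java.sql.Connection;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;

    PreparedStatement ps = conn.prepareStatement(
        "SELECT id, body, updated_date FROM documents " +
        "WHERE updated_date > ? ORDER BY updated_date DESC");
    ps.setTimestamp(1, lastIndexed);
    ResultSet rs = ps.executeQuery();
    while (rs.next()) {
        // re-index this row: delete the old copy by id, then add the new document
    }
    rs.close();
    ps.close();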
On 5/3/06, Kiran Joisher <[EMAIL PROTECTED]> wrote:
Hi all,
I'm working on a project where I will use Lucene to make a search engine on
a database. I am new to Lucene. I wrote a test program which indexes a table
and searches it, but now I'm stuck on how to update the index in case
a database change occurs. I need some help on this topic.
Revati,
This sounds like a Hibernate problem; I suggest you refer to their
documentation and forum.
-Grant
revati joshi wrote:
Hi,
I have tried updating a Lucene index using the Hibernate lifecycle class but
am not able to get the implementation of this class.
www.hibernate.org - Using Lifecycles and Interceptors to update Lucene
searches.htm
The onSave() and onUpdate() methods have a Session parameter which is to
pass
On Jul 24, 2005, at 12:17 PM, Harini Raghavan wrote:
Hi All,
I am trying to add paging functionality while using Lucene search. I have
created a PageFilter that takes the current page number and the number of
records as input, invoking the IndexSearcher with a BooleanQuery
object and the PageFilter. The search returns around 1000 records.
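For comparison, paging can also be done without a filter by slicing the Hits; a rough sketch (1.4-era API; searcher and query assumed, pageNum and pageSize hypothetical, pageNum zero-based):

    import org.apache.lucene.document.Document;
    import org.apache.lucene.search.Hits;

    Hits hits = searcher.search(query);
    int start = pageNum * pageSize;
    int end = Math.min(start + pageSize, hits.length());
    for (int i = start; i < end; i++) {
        Document doc = hits.doc(i);  // Hits fetches documents lazily
        // render doc ...
    }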
I will try that out.
Thanks,
Harini
----- Original Message -----
From: "Chris Hostetter" <[EMAIL PROTECTED]>
To:
Sent: Monday, July 18, 2005 1:14 PM
Subject: Re: Index locked exception while updating index
I freely admit that I wasn't paying much attention to the beginning of
make managing a
singleton IndexWriter really easy.
: Date: Mon, 18 Jul 2005 10:12:39 +0530
: From: Harini Raghavan <[EMAIL PROTECTED]>
: Reply-To: java-user@lucene.apache.org
: To: [EMAIL PROTECTED]
: Cc: java-user@lucene.apache.org
: Subject: Re: Index locked exception while updating index
synchronized and that fixed the problem.
Thanks for your quick response,
Harini
----- Original Message -----
From: "Otis Gospodnetic" <[EMAIL PROTECTED]>
To:
Sent: Monday, July 18, 2005 10:03 AM
Subject: Re: Index locked exception while updating index
Harini,
You are catching IOException in the finally block, but you are not even
printing out the exception stack trace. Perhaps you are not able to
close your IndexWriter for some reason.
Otis
--- Harini Raghavan <[EMAIL PROTECTED]> wrote:
Hi All,
I am quite new to Lucene and I have a problem with locking. I have a
MessageDrivenBean that sends messages to my Lucene indexer whenever there is
a new database update. The indexer updates the index incrementally. Below
is the code fragment in the indexer method that gets invoked by the
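The fix Harini reports above (making the method synchronized) can be sketched like this - an illustration with hypothetical names and a 1.4/2.0-era API, not her actual code. Funneling all updates through one synchronized method means two IndexWriters never contend for the write lock:

    import java.io.IOException;
    import org.apache.lucene.analysis.standard.StandardAnalyzer;
    import org.apache.lucene.document.Document;
    import org.apache.lucene.index.IndexWriter;

    public class Indexer {
        private static final Indexer INSTANCE = new Indexer();
        private final String indexDir = "/path/to/index";  // hypothetical location

        public static Indexer getInstance() { return INSTANCE; }

        // One writer at a time: a second IndexWriter on the same directory
        // would fail with the "index locked" exception from this thread.
        public synchronized void index(Document doc) throws IOException {
            IndexWriter writer = new IndexWriter(indexDir, new StandardAnalyzer(), false);
            try {
                writer.addDocument(doc);
            } finally {
                writer.close();  // always release the write lock
            }
        }
    }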
pashupathinath writes:
> How can I traverse through the values stored in the
> index and make sure that the new records are not
> duplicated? Once I encounter the duplicate primary
> key, I should be able to delete all the various field
> values associated with that primary key.
>
There's
On Friday 08 April 2005 07:42, pashupathinath wrote:
Hi,
I've created an index for database records. The problem is that whenever
I'm trying to update the database, I mean adding or deleting records from
the database, I want the index to be updated too.
Right now, I am adding new documents to the existing index whenever I add
new records to the database
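A sketch of the usual answer to this (not from the truncated reply above; 1.4-era API, hypothetical "pk" field name): delete every document carrying the record's primary key before re-adding it, so duplicates never accumulate.

    import org.apache.lucene.index.IndexReader;
    import org.apache.lucene.index.Term;

    IndexReader reader = IndexReader.open(indexDir);
    reader.delete(new Term("pk", primaryKey));  // removes every doc with this key
    reader.close();                             // commit the deletes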