"Jerome Chauvin" wrote:
> Thanks Michael for your answer, but following check of our processing, it
> appears all the updates of the index are made in a single thread.
> Actually,
> this kind of exception is thrown during a heavy batch processing. This
> processing is not multi-threaded.
Subject: Re: Lucene 2.1: java.io.IOException: Lock obtain timed out:
SimpleFSLock@
"Jerome Chauvin" <[EMAIL PROTECTED]> wrote:
> We encounter issues while updating the lucene index, here is the stack
> trace:
>
> Caused by: java.io.IOException: Lock obtain timed
"Jerome Chauvin" <[EMAIL PROTECTED]> wrote:
> We encounter issues while updating the lucene index, here is the stack
> trace:
>
> Caused by: java.io.IOException: Lock obtain timed out:
> SimpleFSLock@/data/www/orcanta/lucene/store1/write.lock
> at org.apache.
All,
We encounter issues while updating the Lucene index; here is the stack trace:
Caused by: java.io.IOException: Lock obtain timed out:
SimpleFSLock@/data/www/orcanta/lucene/store1/write.lock
at org.apache.lucene.store.Lock.obtain(Lock.java:69)
at
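A stale write.lock left behind by a writer that was never closed (for example after a crash or an unhandled exception) produces exactly this timeout. Purely as an illustrative sketch against the 2.1-era static helpers IndexReader.isLocked/unlock, using the directory from the stack trace above, one way to check for and clear such a lock before opening a writer is:

import org.apache.lucene.index.IndexReader;
import org.apache.lucene.store.Directory;
import org.apache.lucene.store.FSDirectory;

public class LockCheck {
    public static void main(String[] args) throws Exception {
        // Directory taken from the exception above; adjust as needed.
        Directory dir = FSDirectory.getDirectory("/data/www/orcanta/lucene/store1");

        if (IndexReader.isLocked(dir)) {
            // Only force-unlock when you are certain no writer is still running,
            // otherwise the index can be corrupted.
            IndexReader.unlock(dir);
        }
        dir.close();
    }
}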
But locking should be fine even for this "hammering" use case (and if
it's not, that's a bug, and I'd really like to know about it!).
I have hammered over 2.5 million 5-10 KB docs into an index this way (a
realtime system that I had not yet added a special load call to) and had zero
problems. On
Hi Erick and Mike,
Really, thanks a lot for the advice... =)
I will fix my code. I'll let you guys know if any problem arises.
Many thanks and best regards ^ ^
Maureen
Erick Erickson wrote:
Don't do it that way. You're opening and closing your IndexWriter for each
document, which is extremely wasteful. And given locking has been a source
of much discussion on this list, it's not clear that locking will withstand
this kind of hammering. You want to do something like
IndexWriter writ
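The excerpt cuts off here. As a rough sketch of the pattern Erick describes (open one IndexWriter for the whole run, add every document, close once at the end), assuming the 2.1-era String/Analyzer constructor and using a placeholder path, analyzer and field name:

import org.apache.lucene.analysis.standard.StandardAnalyzer;
import org.apache.lucene.document.Document;
import org.apache.lucene.document.Field;
import org.apache.lucene.index.IndexWriter;

public class BatchIndexer {
    public static void main(String[] args) throws Exception {
        // Open the writer once for the whole batch (placeholder path and analyzer).
        IndexWriter writer = new IndexWriter("/path/to/index",
                                             new StandardAnalyzer(),
                                             true); // true = create a new index

        for (String text : new String[] { "doc one", "doc two" }) {
            Document doc = new Document();
            doc.add(new Field("contents", text, Field.Store.YES, Field.Index.TOKENIZED));
            writer.addDocument(doc);
        }

        writer.optimize(); // optional, once at the very end
        writer.close();    // releases write.lock
    }
}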
Hi Mike, thanks for the reply...
1. Here is the class that I use for indexing:
package edu.ntu.ce.maureen.index;
import org.apache.lucene.index.IndexWriter;
import org.apache.lucene.analysis.Analyzer;
import org.apache.lucene.analysis.standard.StandardAnalyzer;
import org.
One way to mitigate the cost of this kind of thing is to create a series of
indexes on portions of your corpus and then merge them. Say you have 10,000
documents. Create 10 separate indexes of 1,000 documents each, then use
IndexWriter.addIndexes to make them all into a single index.
This pre-supp
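For illustration, a minimal sketch of that merge step, assuming the 2.1-era IndexWriter.addIndexes(Directory[]) signature and placeholder paths for the partial indexes and the merge target:

import org.apache.lucene.analysis.standard.StandardAnalyzer;
import org.apache.lucene.index.IndexWriter;
import org.apache.lucene.store.Directory;
import org.apache.lucene.store.FSDirectory;

public class MergeIndexes {
    public static void main(String[] args) throws Exception {
        // Placeholder paths: the partial indexes built earlier.
        Directory[] parts = new Directory[] {
            FSDirectory.getDirectory("/path/to/index-part-0"),
            FSDirectory.getDirectory("/path/to/index-part-1")
        };

        // Placeholder path for the merged index.
        IndexWriter writer = new IndexWriter("/path/to/merged-index",
                                             new StandardAnalyzer(),
                                             true); // create the target index
        writer.addIndexes(parts); // merge the partial indexes into the target
        writer.optimize();
        writer.close();
    }
}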
maureen tanuwidjaja wrote:
I am indexing thousands of XML documents, then it stops after indexing for
about 7 hrs
...
Indexing C:\sweetpea\wikipedia_xmlfiles\part-0\37027.xml
java.io.IOException: Lock obtain timed out: [EMAIL PROTECTED]:\sweetpea\dual_index\DI\write.lock
java.lang
Hi,
I am indexing thousands of XML documents, then it stops after indexing for
about 7 hrs
...
Indexing C:\sweetpea\wikipedia_xmlfiles\part-0\37003.xml
Indexing C:\sweetpea\wikipedia_xmlfiles\part-0\37004.xml
Indexing C:\sweetpea\wikipedia_xmlfiles\part-0\37008.xml
Indexing C:\swee