Don't commit after adding each and every document.
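To make the batching concrete, here is a minimal runnable sketch of "add many, commit once". FakeWriter is a stand-in class invented here so the example runs without Lucene on the classpath; with the real IndexWriter the shape is the same, and commit() is the expensive call (it forces index files to disk), which is why per-document commits hurt.

```java
import java.util.Arrays;
import java.util.List;

public class BatchingDemo {
    // Stand-in for Lucene's IndexWriter (hypothetical, for illustration only).
    static class FakeWriter {
        int adds = 0;
        int commits = 0;
        void addDocument(String doc) { adds++; }
        void commit() { commits++; }
    }

    // Index all documents, committing once at the end instead of per document.
    static FakeWriter indexAll(List<String> docs) {
        FakeWriter writer = new FakeWriter();
        for (String doc : docs) {
            writer.addDocument(doc);  // no commit inside the loop
        }
        writer.commit();              // one commit for the whole batch
        return writer;
    }

    public static void main(String[] args) {
        FakeWriter w = indexAll(Arrays.asList("a", "b", "c"));
        System.out.println(w.adds + " adds, " + w.commits + " commit");
    }
}
```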
On Tue, Sep 3, 2013 at 7:20 AM, nischal reddy wrote:
> Hi,
>
> Some more updates on my progress,
>
> I have multithreaded indexing in my application: I used a thread pool
> executor with a pool size of 4, but saw only a very slight increase in
Hi,
Some more updates on my progress,
I have multithreaded indexing in my application: I used a thread pool
executor with a pool size of 4, but saw only a very slight, frankly
negligible, increase in performance; it is still taking around 20 minutes to
index around 30k files.
Some more inf
Hi Eric,
I have commented out the indexing part (the indexWriter.addDocument() call) in
my application and it takes around 90 seconds, but when I uncomment the
indexing part it takes a lot of time.
My machine specs are:
Windows 7, Intel i7 processor, 4 GB RAM, and no SSD hard disk.
can
Hi,
Lucene's IndexWriter can safely accept updates coming from several
threads; just make sure to share the same IndexWriter instance across
all threads. No external locking is necessary.
30 minutes sounds like a lot for 30k files unless they are large.
You can have a look at
http://wiki.apache
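In sketch form, the shared-writer pattern looks like this. FakeIndexWriter is a thread-safe stand-in invented here so the example runs without Lucene; with the real IndexWriter the structure is identical, because addDocument() is safe to call from multiple threads on one shared instance.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicInteger;

public class SharedWriterDemo {
    // Stand-in for Lucene's IndexWriter (hypothetical, for illustration only).
    static class FakeIndexWriter {
        final AtomicInteger docs = new AtomicInteger();
        void addDocument(String doc) { docs.incrementAndGet(); }
    }

    static int indexInParallel(List<String> files, int threads) {
        FakeIndexWriter writer = new FakeIndexWriter();   // ONE shared instance
        ExecutorService pool = Executors.newFixedThreadPool(threads);
        for (String file : files) {
            pool.submit(() -> writer.addDocument(file));  // all threads share it
        }
        pool.shutdown();
        try {
            pool.awaitTermination(1, TimeUnit.MINUTES);
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
        return writer.docs.get();
    }

    public static void main(String[] args) {
        List<String> files = new ArrayList<>();
        for (int i = 0; i < 1000; i++) files.add("file" + i);
        System.out.println(indexInParallel(files, 4));
    }
}
```

Note that nothing in the loop takes a lock: the writer itself is responsible for its own thread safety, which is exactly the "no external locking" point above.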
Stop. Back up. Test.
The very _first_ thing I'd do is just comment out the bit that
actually indexes the content. I'm guessing you have some
loop like:
while (more files) {
    read the file
    transform the data
    create a Lucene document
    index the document
}
Just comment out the "index
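One way to do that measurement is to time the pipeline with the indexing step stubbed out, then with it enabled. All the stage methods below are placeholders invented for the sketch (the real ones would read files, run the analyzer, call the writer, and so on); only the timing structure matters.

```java
public class TimingDemo {
    // Placeholder pipeline stages (hypothetical, for illustration only).
    static String readFile(String name)       { return "contents of " + name; }
    static String transform(String data)      { return data.toUpperCase(); }
    static String createDocument(String data) { return "doc:" + data; }
    static void indexDocument(String doc)     { /* writer.addDocument(doc) */ }

    static long timePipeline(int files, boolean withIndexing) {
        long start = System.nanoTime();
        for (int i = 0; i < files; i++) {
            String data = transform(readFile("file" + i));
            String doc = createDocument(data);
            if (withIndexing) {
                indexDocument(doc);  // disable this step to isolate its cost
            }
        }
        return System.nanoTime() - start;
    }

    public static void main(String[] args) {
        System.out.println("without indexing: " + timePipeline(10_000, false) + " ns");
        System.out.println("with indexing:    " + timePipeline(10_000, true) + " ns");
    }
}
```

Comparing the two numbers tells you whether the time is going into reading/transforming or into the index step itself, which is the whole point of the test above.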
bq: I think you might be referring to the close method not being present in the
catch block
This section of code will close iwcTemp1 in the "if" clause but not in the
"else" clause, even if no exceptions are thrown. And if there _are_
any exceptions thrown, as others pointed out, then the iwcTemp1 _n
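The usual fix for "closed in one branch but not the other" is to take the close out of the branches entirely. IndexWriter implements Closeable, so try-with-resources (or a finally block) guarantees the close on every path, exception or not. FakeWriter below is a stand-in invented here so the sketch runs without Lucene:

```java
public class CloseDemo {
    // Stand-in for a Closeable IndexWriter (hypothetical, for illustration only).
    static class FakeWriter implements AutoCloseable {
        boolean closed = false;
        void addDocument(String doc) {
            if (closed) throw new IllegalStateException("writer already closed");
        }
        @Override public void close() { closed = true; }
    }

    // close() runs whether the body completes normally or throws.
    static FakeWriter indexOne(String doc, boolean fail) {
        FakeWriter writer = new FakeWriter();
        try (FakeWriter w = writer) {
            if (fail) throw new RuntimeException("simulated indexing failure");
            w.addDocument(doc);
        } catch (RuntimeException e) {
            // handle or log; the writer is already closed by this point
        }
        return writer;
    }

    public static void main(String[] args) {
        System.out.println(indexOne("a", false).closed);
        System.out.println(indexOne("a", true).closed);
    }
}
```

With this shape there is no branch where the close can be forgotten, which also releases the write lock that causes the LockObtainFailed errors discussed later in the thread.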
Hi,
I am thinking of making my Lucene indexing multithreaded; can someone throw
some light on the best approach to follow for achieving this?
I will give a short gist of what I am trying to do; please suggest the
best way to tackle this.
What am I trying to do?
I am building an index fo
Hi All,
I was able to figure out what was going wrong: I had initialised the input
in the constructor, which is called only once; since the same instance
is reused, even subsequent files were using the old reader
instance.
I have refactored my code to use the input (reader i
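A minimal reproduction of that bug, using only java.io (the tokenizer classes here are invented for the sketch, not Lucene's): state captured once in a constructor is stale when the same instance is reused for the next file. Since Lucene reuses tokenizer instances across documents, per-document state belongs in a reset/setReader-style method, not in the constructor.

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.Reader;
import java.io.StringReader;
import java.io.UncheckedIOException;

public class ReuseDemo {
    // BUG: the reader is captured once at construction, so a reused
    // instance keeps tokenizing the first file forever.
    static class BrokenTokenizer {
        final BufferedReader reader;
        BrokenTokenizer(Reader input) { this.reader = new BufferedReader(input); }
        String next() {
            try { return reader.readLine(); }
            catch (IOException e) { throw new UncheckedIOException(e); }
        }
    }

    // FIX: fresh input is installed per document via setReader().
    static class FixedTokenizer {
        BufferedReader reader;
        void setReader(Reader input) { this.reader = new BufferedReader(input); }
        String next() {
            try { return reader.readLine(); }
            catch (IOException e) { throw new UncheckedIOException(e); }
        }
    }

    public static void main(String[] args) {
        FixedTokenizer t = new FixedTokenizer();
        t.setReader(new StringReader("first file"));
        System.out.println(t.next());
        t.setReader(new StringReader("second file"));  // fresh input on reuse
        System.out.println(t.next());
    }
}
```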
Hi,
LockObtainFailed is *always* caused by a missing close of the IndexWriter. The
code is hard to understand, and things like the evil Eclipse
"automated printStackTrace catch block" make it obvious that the code is not
designed correctly. To write good code, please *disable* this automatic featu
Hi,
I have created a custom analyzer with a custom tokenizer which takes ANTLR
tokens from a file and converts them into Lucene tokens by setting them
on the attribute source.
It works fine if I add one document to the index; I am able to search via a
query and get hits.
Problem comes whe
Hello..
I think you might be referring to the close method not being present in the
catch block. If that's so, that was purposely done for the time being...
Otherwise, if the execution happens properly, the flow won't go to catch
as of now. Obviously the close will be present in both catch and finally