Hi,
Has anybody used Lucene to index syslogs? What is the maximum indexing rate
we can get when storing 200-byte documents with 14 fields each?
thanks,
MSK
--
Every day brings us a sea of opportunities
How fast is the data being added to your syslogs?
And on and on and on.
All you can do is set up a test and see. It shouldn't be very hard to, say,
write a small program that generates randomized input, pushes it through the
indexing process, and measures the rate.
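
A minimal sketch of such a test, assuming the Lucene 2.x API that was current at the time (the index path, field names, and document shape below are made up for illustration):

import org.apache.lucene.analysis.standard.StandardAnalyzer;
import org.apache.lucene.document.Document;
import org.apache.lucene.document.Field;
import org.apache.lucene.index.IndexWriter;
import java.util.Random;

public class IndexRateTest {
    public static void main(String[] args) throws Exception {
        int numDocs = 100000;          // synthetic documents to push through
        Random rnd = new Random(42);

        // create a fresh index; buffering more docs in RAM before flushing
        // usually helps raw throughput
        IndexWriter writer = new IndexWriter("/tmp/syslog-test", new StandardAnalyzer(), true);
        writer.setMaxBufferedDocs(1000);

        long start = System.currentTimeMillis();
        for (int i = 0; i < numDocs; i++) {
            Document doc = new Document();
            // 14 short fields of randomized content, roughly 200 bytes per document
            for (int f = 0; f < 14; f++) {
                doc.add(new Field("field" + f, "val" + rnd.nextInt(10000),
                        Field.Store.YES, Field.Index.UN_TOKENIZED));
            }
            writer.addDocument(doc);
        }
        writer.close();                // include the final flush/merge in the timing

        long elapsed = System.currentTimeMillis() - start;
        System.out.println(numDocs * 1000L / elapsed + " docs/sec");
    }
}

Run against the real hardware with realistic field values, this gives a far more trustworthy number than anything quoted on the list.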
Best
Erick
On 1/29/07, Saravana <[EMAIL PROTECTED]> wrote:
Hi,
Is it possible to scale Lucene indexing to 2000-3000 documents per second?
I need to index 10 fields, each 20 bytes long. I should be able to search by
giving any of the field values as criteria, and I need to get the count of
documents that have the same field values.
Will it be possible?
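
As a rough sketch of the query side, assuming the fields are indexed untokenized with the Lucene 2.x API (the field name, value, and index path are made up):

import org.apache.lucene.index.Term;
import org.apache.lucene.search.Hits;
import org.apache.lucene.search.IndexSearcher;
import org.apache.lucene.search.TermQuery;

public class CountByField {
    public static void main(String[] args) throws Exception {
        IndexSearcher searcher = new IndexSearcher("/tmp/syslog-test");

        // exact match on any single field; Hits.length() is the number of
        // documents with that field value
        Hits hits = searcher.search(new TermQuery(new Term("field3", "val123")));
        System.out.println("matching documents: " + hits.length());

        searcher.close();
    }
}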
with regards,
Saravana
You'll have a huge index pretty soon,
so you won't be able to do this for very long.
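For a rough sense of scale: at 3000 documents per second with roughly 200 bytes of field data each, that is about 600 KB/s, or on the order of 50 GB of raw field data per day, before any postings or stored-field overhead.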
What are you trying to accomplish anyway?
Erick
On 2/27/07, Saravana <[EMAIL PROTECTED]> wrote:
>
> Hi,
>
> Is it possible to scale lucene indexing like 2000/3000 documents per
> second? I need to index 10 fields, each 20 bytes long.
To: java-user@lucene.apache.org
Date: Thu, 1 Mar 2007 10:28:07 +0200
Subject: Re: indexing performance
On Tue, Feb 27, 2007, Saravana wrote about "indexing performance":
> Hi,
>
> Is it possible to scale lucene indexing like 2000/3000 documents per
> second?
I don't know about the actual
Dear All,
I am creating one index per day, i.e. each index holds the data for one 24-hour
period (00:00:00 to 23:59:59). This time value is a field in each document.
Each document has other text fields too, on which users can search.
A user's search criteria would be a combination of the time and other field
values.
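
A rough sketch of what such a combined query could look like, assuming the Lucene 2.x API and that the time field is indexed as a zero-padded, lexicographically sortable string such as yyyyMMddHHmmss (the index path and field names are made up):

import org.apache.lucene.index.Term;
import org.apache.lucene.search.BooleanClause;
import org.apache.lucene.search.BooleanQuery;
import org.apache.lucene.search.Hits;
import org.apache.lucene.search.IndexSearcher;
import org.apache.lucene.search.RangeQuery;
import org.apache.lucene.search.TermQuery;

public class TimeAndFieldSearch {
    public static void main(String[] args) throws Exception {
        // one index per day, so open the index for the day being queried
        IndexSearcher searcher = new IndexSearcher("/indexes/2007-03-01");

        BooleanQuery query = new BooleanQuery();
        // time range within the day; a very wide range can expand into many
        // terms, in which case a RangeFilter may behave better
        query.add(new RangeQuery(new Term("time", "20070301080000"),
                                 new Term("time", "20070301090000"), true),
                  BooleanClause.Occur.MUST);
        // plus whatever other field the user filtered on
        query.add(new TermQuery(new Term("severity", "error")),
                  BooleanClause.Occur.MUST);

        Hits hits = searcher.search(query);
        System.out.println("matches: " + hits.length());

        searcher.close();
    }
}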