: As for backing up, you could try making a backup using IndexWriter's
: addIndexes method, and I imagine hard-links would work, too.
Hardlinks do in fact work quite well. This is how the Solr backup scripts
work...
http://svn.apache.org/repos/asf/incubator/solr/trunk/src/scripts/backup
-Ho
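For anyone wanting to try this by hand, a hard-link snapshot in the spirit of those scripts might look like the sketch below. The paths are made-up examples; the key idea is that `cp -lr` creates hard links instead of copying data, and since Lucene never rewrites existing segment files (it only writes new ones and deletes old ones), the linked snapshot stays intact while the live index moves on.

```shell
# Hard-link snapshot of a Lucene index directory (paths are examples only).
INDEX=index
BACKUP=backups/snapshot.$(date +%Y%m%d%H%M%S)
mkdir -p "$(dirname "$BACKUP")"
# -l: hard-link files instead of copying their contents; -r: recurse.
cp -lr "$INDEX" "$BACKUP"
```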
: I am indexing the price of some items in my index, and I have quite a
: few indexes. I want to sort the items by price. Is there a way to
: include all the indexes in one go while sorting, or do I have to do it
: one index at a time?
I assume you are using a MultiSearcher,
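A MultiSearcher lets you run one sorted search across several indexes at once. A minimal sketch, using the Lucene 1.4-era API; the index paths, query, and the "price" field name are all made-up examples:

```java
import org.apache.lucene.index.Term;
import org.apache.lucene.search.*;

public class PriceSortAcrossIndexes {
    public static void main(String[] args) throws Exception {
        // Wrap each physical index in its own searcher (paths are examples).
        Searchable[] subSearchers = {
            new IndexSearcher("/indexes/part1"),
            new IndexSearcher("/indexes/part2"),
        };
        Searcher searcher = new MultiSearcher(subSearchers);

        // One search, sorted by the "price" field, across all sub-indexes.
        Sort byPrice = new Sort(new SortField("price", SortField.FLOAT));
        Hits hits = searcher.search(new TermQuery(new Term("type", "item")), byPrice);
        for (int i = 0; i < hits.length(); i++) {
            System.out.println(hits.doc(i).get("price"));
        }
        searcher.close();
    }
}
```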
Marc -
We built our index maintenance operation to assume a breakdown could occur
mid-process (because it has happened several times). We exist in an environment
where "always on, always available" is a business requirement. We also do a
lot of updates on a cyclical basis (every 10 minutes), so malf
Scott Smith wrote:
I'm building an application which has to provide "real-time" searching
of emails as they come in. I have a number of search strings that I
need to apply against each email as it comes in and then do something
with the email based on which search string(s) get a hit.
My ini
Yes, it does compute these distances for all the terms of the field
specified, but only once (per IndexReader). This is where the
techniques Solr employs come in really handy... warming up caches by
running searches and sorts before putting an index into service.
Erik
On May 12, 2
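That warm-up step can be sketched roughly as below. This is a hypothetical helper, not Solr's actual code; the path, query, and field name are assumptions:

```java
import org.apache.lucene.index.Term;
import org.apache.lucene.search.*;

public class Warmer {
    // Run a representative sorted search against a freshly opened searcher
    // before handing it to user traffic, so the per-reader caches are built
    // up front rather than on a user's first request.
    public static IndexSearcher openWarmed(String path) throws Exception {
        IndexSearcher fresh = new IndexSearcher(path);
        Sort byPrice = new Sort(new SortField("price", SortField.FLOAT));
        fresh.search(new TermQuery(new Term("type", "item")), byPrice);
        return fresh; // only now swap this in and close the old searcher
    }
}
```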
I am looking at the DistanceComparatorSource class (for custom sorting), and
it looks like it calculates the distance for every record in the index, not
just the records returned by the search, making the system very slow.
Is my observation correct? Are there ways to optimize this process?
Thanks,
Urv
In the best case you'd just need to remove the lock file.
Never happened to me (knock on wood), but others have reported corrupt indices.
Some, I believe, had to manually edit their segments file to get things in
order, and possibly reindex data in lost segments.
As for backing up, you could tr
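For the lock-file case: Lucene 1.4 keeps its lock files under `java.io.tmpdir` (usually `/tmp`), with names like `lucene-<hash>-write.lock` and `lucene-<hash>-commit.lock`. The path below is an assumption; check where your JVM's tmpdir actually points, and only remove a lock once you are sure no writer process is still running.

```shell
# List any stale Lucene lock files, then remove them.
ls /tmp/lucene-*-write.lock /tmp/lucene-*-commit.lock 2>/dev/null
rm -f /tmp/lucene-*-write.lock /tmp/lucene-*-commit.lock
```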
Daniel,
The number of open files depends on several factors. The formula for
calculating it is in Lucene in Action (page 401); it depends on the number of
segments, index files, and indexed fields.
Otis
- Original Message
From: Daniel Cortes <[EMAIL PROTECTED]>
Hi everyone,
Just wanted to get people's views on an indexing issue. I gather a lot of
people have apps where indexing writes to the same index as is used by the
searcher. The thing that bothers me about this is if indexing is interrupted
(file system crash, out of disk space etc) the index be
Hi Luceners, I want to know how many file descriptors you can have.
My Lucene version is 1.4.3.
I am now getting a lot of "Too many open files" errors. I think the
problem must be caused by something else, because I work with only two
indexes, and they aren't big (10 MB).
For this reason I want to ask y
Hi
I am indexing the price of some items in my index, and I have quite a
few indexes. I want to sort the items by price. Is there a way to
include all the indexes in one go while sorting, or do I have to do it
one index at a time?
Cheers
KINNAR
This is true, but you'd need to optimise if you want additions to show up.
It also means getting a new IndexSearcher each time, which is not workable for
some Lucene applications (especially if you've pre-built filters and caches). I
think the suggestion to use the new memory class is a good one.
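Assuming the "memory class" here is the contrib MemoryIndex, matching each incoming email against saved searches might look like the sketch below. The field name and method are made-up examples:

```java
import org.apache.lucene.analysis.standard.StandardAnalyzer;
import org.apache.lucene.index.memory.MemoryIndex;
import org.apache.lucene.queryParser.QueryParser;
import org.apache.lucene.search.Query;

public class EmailMatcher {
    // Build a throwaway in-memory index for one email and test a saved
    // query against it; no disk index or IndexSearcher reopen involved.
    public static boolean matches(String emailBody, String savedQuery) throws Exception {
        MemoryIndex index = new MemoryIndex();
        index.addField("body", emailBody, new StandardAnalyzer());
        Query q = QueryParser.parse(savedQuery, "body", new StandardAnalyzer());
        return index.search(q) > 0.0f; // score > 0 means the query hit
    }
}
```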
-
Hi,
You can add documents to the index one at a time, without reindexing every
document from scratch! That way you get near-real-time updates.
Regards,
Scott Smith <[EMAIL PROTECTED]> wrote:
I'm building an application which has to provide "real-time" searching
of emails as they come in. I have a number o
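The incremental-add approach looks roughly like this sketch (Lucene 1.9/2.0-era Field API; the index path and field name are made-up examples):

```java
import org.apache.lucene.analysis.standard.StandardAnalyzer;
import org.apache.lucene.document.Document;
import org.apache.lucene.document.Field;
import org.apache.lucene.index.IndexWriter;

public class IncrementalAdd {
    // Append one document to an existing index; nothing else is reanalyzed.
    public static void addEmail(String emailBody) throws Exception {
        IndexWriter writer = new IndexWriter("/indexes/mail",
                new StandardAnalyzer(), false); // false = append, don't recreate
        Document doc = new Document();
        doc.add(new Field("body", emailBody, Field.Store.YES, Field.Index.TOKENIZED));
        writer.addDocument(doc);
        writer.close(); // searchers reopened after this will see the new doc
    }
}
```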