I hacked up the test a bit so it would compile against 9.0 and confirmed
the problem existed there as well.
So going back a little farther with some manual bisection (to account for
the transition from ant to gradle) led me to the following...
# first bad commit: [2719cf6630eb2bd7cb37d0e8462
With some trial and error I realized two things...
1) the order of the terms in the BooleanQuery seems to matter
- but in terms of their "natural order", not the order in the doc
(which is why I was so confused trying to reproduce it)
2) the problem happens when using termVectors but
I've got a user getting java.lang.IndexOutOfBoundsException from the
UnifiedHighlighter in Solr 9.1.0 w/Lucene 9.3.0
(And FWIW, this same data, w/same configs, in 8.11.1, purportedly didn't
have this problem)
I don't really understand the highlighter code very well, but AFAICT:
- Defaul
Thank you for your help Michael. I've solved the problem by recreating the
index.
The OutOfMemoryError killed the thread which was responsible for index
maintenance, so the index recreation failed without an error message. After
recreating the index, the problem is solved.
Sorry for
When I run checkIndex on your index, I see a new exception:
org.apache.lucene.index.CorruptIndexException: Incompatible format
version: 119865344 expected 1 or lower
at org.apache.lucene.index.FieldsReader.&lt;init&gt;(FieldsReader.java:116)
at
org.apache.lucene.index.SegmentReader.initialize
Instead of ignoring the exceptions in your finally clause, can you log
them? It could be something interesting is happening in there...
I'll have a look at the index.
Mike
"René Zöpnek" wrote:
> Thanks for your answer, Mike.
>
> Unfortunately I have no direct access to the server with the corr
Thanks for your answer, Mike.
Unfortunately I have no direct access to the server with the corrupt index. So
changing the creation process of the index is not possible.
I've uploaded the index to http://drop.io/hlu53sl (9 MB).
Here is the code for creating the index:
public static void crea
Something appears to be wrong with your _X.tii file (inside the
compound file).
Can you post the code that recreates this broken index?
Since it appears to be repeatable, could you regenerate your index
with compound file off, confirm the problem still happens, and then
post the _X.tii f
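For readers following along, regenerating an index with the compound file format off is a one-line writer setting in the Lucene 2.x API. This is only a sketch: the index path, analyzer, and class name below are placeholder assumptions, not taken from the original mail.

```java
// Hypothetical sketch (Lucene 2.x API): write the index with the compound
// file format disabled, so _X.tii/_X.tis/_X.fdt etc. stay as separate files
// instead of being packed into a .cfs compound file.
import org.apache.lucene.analysis.standard.StandardAnalyzer;
import org.apache.lucene.index.IndexWriter;

public class NoCompoundIndex {
    public static void main(String[] args) throws Exception {
        // "/path/to/index" is a placeholder; true = create a fresh index
        IndexWriter writer = new IndexWriter("/path/to/index",
                new StandardAnalyzer(), true);
        writer.setUseCompoundFile(false);
        // ... add documents exactly as in the original indexing code ...
        writer.close();
    }
}
```

With the compound file off, the suspect _X.tii file can be inspected or posted on its own.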
Hello,
I'm using Lucene 2.3.2 and had no problems until now.
But now I got a corrupt index. When searching, a java.lang.OutOfMemoryError is
thrown. I wrote the following test program:
private static void search(String index, String query) throws
CorruptIndexException, IOException, ParseEx
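The posted program is cut off in the archive. A minimal sketch of what such a search test typically looks like against the Lucene 2.3-era API is below; the field name "contents" and the analyzer are assumptions, not from the original mail.

```java
// Hypothetical reconstruction of a minimal 2.3-era search test.
// Field name "contents" and StandardAnalyzer are assumptions.
import org.apache.lucene.analysis.standard.StandardAnalyzer;
import org.apache.lucene.queryParser.QueryParser;
import org.apache.lucene.search.Hits;
import org.apache.lucene.search.IndexSearcher;
import org.apache.lucene.search.Query;

public class SearchTest {
    private static void search(String index, String query) throws Exception {
        IndexSearcher searcher = new IndexSearcher(index);
        Query q = new QueryParser("contents", new StandardAnalyzer())
                .parse(query);
        // Opening/searching loads the term index; on a corrupt _X.tii
        // this is where an OutOfMemoryError can surface.
        Hits hits = searcher.search(q);
        System.out.println(hits.length() + " hits");
        searcher.close();
    }
}
```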
It looks like your stored fields file (_X.fdt) is corrupt.
Are you using multiple threads to add docs?
Can you try switching to SerialMergeScheduler to verify it's reproducible?
When you hit this exception, can you stop Solr and then run Lucene's
CheckIndex tool (org.apache.lucene.index.CheckIndex)?
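For reference, CheckIndex is run from the command line against a stopped index; the jar name and index path below are placeholders for whatever version is in use.

```shell
# Hypothetical invocation; adjust the jar name/version and index path.
java -cp lucene-core.jar org.apache.lucene.index.CheckIndex /path/to/index

# CheckIndex also has a repair mode that DROPS unrecoverable segments
# (and the documents in them), so back the index up before using it:
# java -cp lucene-core.jar org.apache.lucene.index.CheckIndex /path/to/index -fix
```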
(switching to java-user)
OK, that's great that it's so reproducible.
To rule out a JVM bug, it would be great if you could try out Sun's
1.6.0_03 to see if it still happens.
-Yonik
On Thu, Aug 14, 2008 at 10:18 PM, Ian Connor <[EMAIL PROTECTED]> wrote:
> I seem to be able to reproduce this very e
I have an index of roughly 2 million docs making up almost 200GB and I
can't seem to merge any additional indexes into it. Here is the error
I continuously get, always with the Index: 85, Size: 13
I couldn't find much in the previous mailing list posts nor on ol'
faithful Google.
help/ideas?
jav
Hello,
we're running into strange Lucene problems here right now: Occasionally
certain lists of hits do not build, and we end with an Exception. See
below. While this error appears often, it is not deterministic, i.e. if you
repeat the identical search with the same data, you might get no such
exception.