Hello,
We have a technical issue with our usage of Lucene that has us puzzled about
its possible source.
To describe the issue: we have an application with good response times on
search, but after a certain amount of time, anywhere from a few hours to a few
days, the searches that were taking a few hundred
Hi,
You can still change the setting on the TokenFilter after creating it:
StopFilter#setEnablePositionIncrements(false) - this method was *not* removed!
This fails only if you pass matchVersion >= Version.LUCENE_44. Just use an older
matchVersion parameter to the constructor and you can still ena
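In code, a minimal sketch (assuming the Lucene 4.x analysis API; the tokenizer, sample text and stop words below are just placeholders):

    import java.io.StringReader;
    import org.apache.lucene.analysis.TokenStream;
    import org.apache.lucene.analysis.core.StopFilter;
    import org.apache.lucene.analysis.core.WhitespaceTokenizer;
    import org.apache.lucene.analysis.util.CharArraySet;
    import org.apache.lucene.util.Version;

    public class StopFilterSetterExample {
      public static void main(String[] args) throws Exception {
        // Pre-4.4 matchVersion, so the deprecated setter is still permitted.
        CharArraySet stopWords = StopFilter.makeStopSet(Version.LUCENE_43, "the", "a", "an");
        TokenStream tokens = new WhitespaceTokenizer(Version.LUCENE_43,
            new StringReader("the quick brown fox"));
        StopFilter stop = new StopFilter(Version.LUCENE_43, tokens, stopWords);
        stop.setEnablePositionIncrements(false); // would throw if matchVersion >= LUCENE_44
      }
    }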
Hi there,
The StopFilterFactory can be used to produce StopFilters with the desired
stop words inside of them. As a constructor argument it takes a
Map, and one of the valid keys you can pass inside of that is
"enablePositionIncrements". If you don't pass that in, it defaults to
true. Is this wh
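Roughly like this (a sketch only, assuming the Lucene 4.4-era factory API; the "stopwords.txt" resource and the version string are illustrative):

    import java.io.StringReader;
    import java.util.HashMap;
    import java.util.Map;
    import org.apache.lucene.analysis.TokenStream;
    import org.apache.lucene.analysis.core.StopFilterFactory;
    import org.apache.lucene.analysis.core.WhitespaceTokenizer;
    import org.apache.lucene.analysis.util.ClasspathResourceLoader;
    import org.apache.lucene.util.Version;

    public class StopFilterFactoryExample {
      public static void main(String[] args) throws Exception {
        Map<String, String> params = new HashMap<String, String>();
        params.put("luceneMatchVersion", "4.3");         // keep < 4.4 so "false" is accepted
        params.put("words", "stopwords.txt");            // classpath file, one stop word per line
        params.put("enablePositionIncrements", "false"); // defaults to true when omitted
        StopFilterFactory factory = new StopFilterFactory(params);
        factory.inform(new ClasspathResourceLoader());   // loads the stop-word file
        TokenStream tokens = new WhitespaceTokenizer(Version.LUCENE_43,
            new StringReader("the quick brown fox"));
        TokenStream filtered = factory.create(tokens);   // StopFilter with the configured words
      }
    }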
I don't think that you should use the facet module. If all you want is to
encode a bunch of numbers under a 'foo' field, you can encode them into a
byte[] and index them as a BDV. Then at search time you get the BDV and
decode the numbers back. The facet module adds complexity here: yes, you
get th
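For example, something along these lines (a sketch only; the field name "foo", the fixed-width int encoding, and the sample values are illustrative):

    import java.nio.ByteBuffer;
    import org.apache.lucene.document.BinaryDocValuesField;
    import org.apache.lucene.document.Document;
    import org.apache.lucene.util.BytesRef;

    public class PackedNumbersExample {
      public static void main(String[] args) {
        // Pack this document's numbers into one byte[] and index it as a single BDV.
        int[] values = {3, 17, 42};
        ByteBuffer buf = ByteBuffer.allocate(values.length * 4);
        for (int v : values) {
          buf.putInt(v);
        }
        Document doc = new Document();
        doc.add(new BinaryDocValuesField("foo", new BytesRef(buf.array())));
        // writer.addDocument(doc);
        // At search time, fetch the document's BytesRef from the "foo"
        // BinaryDocValues and decode it with ByteBuffer.wrap(...).getInt().
      }
    }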