Hi,
I have the same problem.
for (String str2 : split) {
    TokenStream stream = analyzer.tokenStream("field", new StringReader(str2));
    CharTermAttribute termAtt = stream.addAttribute(CharTermAttribute.class);
Hi Uwe,
Thanks for your immediate response and sorry for my late reply. I managed to
solve my problem. Your comment was enough to guide me in the right
direction.
The problem was indeed inside my custom Analyzers/Tokenizers. The key point
here is that createComponents() is called only once, while the resulting
TokenStreamComponents are then reused for all subsequent calls.
Uwe
-
Uwe Schindler
H.-H.-Meier-Allee 63, D-28213 Bremen
http://www.thetaphi.de
eMail: u...@thetaphi.de
> -----Original Message-----
> From: zzT [mailto:zis@gmail.com]
> Sent: Thursday, July 11, 2013 9:31 AM
> To: java-user@lucene.apache.org
> Subject: Lucene 4.0 tokenstre
Hi all,
I'm migrating from Lucene 3.6.1 to 4.3.1, and there seems to be a major
change in how analyzers work.
Given the code example below (which is almost copied from
http://lucene.apache.org/core/4_3_1/core/index.html)
@Test
public void testAnalysis() throws IOException {
    final String