Re: Looking For Tokenizer With Custom Delimiter

2018-01-08 Thread Armins Stepanjans

RE: Looking For Tokenizer With Custom Delimiter

2018-01-08 Thread Uwe Schindler

Re: Looking For Tokenizer With Custom Delimiter

2018-01-08 Thread Armins Stepanjans
Thanks for the solution; however, I am unable to access the CharTokenizer class when I import it using: import org.apache.lucene.analysis.util.*; although I am able to access classes directly under analysis (or analysis.standard) just fine with the import statement: import org.apache.lucene.analysis. …
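
For context, a minimal compile-check sketch of the two import styles being compared, assuming a Lucene 7.x classpath where classes directly under org.apache.lucene.analysis ship in lucene-core, while CharTokenizer lives in org.apache.lucene.analysis.util inside the lucene-analyzers-common jar (the class name ImportCheck is made up for the example):

    // Assumes both lucene-core and lucene-analyzers-common 7.x are on the classpath.
    import org.apache.lucene.analysis.Tokenizer;                  // lucene-core
    import org.apache.lucene.analysis.standard.StandardAnalyzer;  // lucene-core
    import org.apache.lucene.analysis.util.CharTokenizer;         // lucene-analyzers-common

    public class ImportCheck {
        public static void main(String[] args) {
            // If only the first two imports resolve, the analyzers-common jar
            // is most likely missing from the classpath.
            System.out.println(Tokenizer.class.getName());
            System.out.println(StandardAnalyzer.class.getName());
            System.out.println(CharTokenizer.class.getName());
        }
    }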

RE: Looking For Tokenizer With Custom Delimiter

2018-01-08 Thread Uwe Schindler
Moin, plain easy to customize with lambdas! E.g., an elegant way to create a tokenizer which behaves exactly like WhitespaceTokenizer and LowerCaseFilter is: Tokenizer tok = CharTokenizer.fromSeparatorCharPredicate(Character::isWhitespace, Character::toLowerCase); Adjust the lambdas and you …
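
For illustration, a self-contained sketch of the same idea, assuming a Lucene 7.x classpath; it uses the single-predicate fromSeparatorCharPredicate variant and does the lowercasing in a separate LowerCaseFilter stage, and the class name, comma delimiter, and sample input are made up for the example:

    import java.io.StringReader;

    import org.apache.lucene.analysis.LowerCaseFilter;
    import org.apache.lucene.analysis.TokenStream;
    import org.apache.lucene.analysis.Tokenizer;
    import org.apache.lucene.analysis.tokenattributes.CharTermAttribute;
    import org.apache.lucene.analysis.util.CharTokenizer;

    public class CustomDelimiterDemo {
        public static void main(String[] args) throws Exception {
            // Treat ',' as the separator; every other character is part of a token.
            Tokenizer tok = CharTokenizer.fromSeparatorCharPredicate(c -> c == ',');
            tok.setReader(new StringReader("Foo,Bar,BAZ"));

            // Lowercase in a filter rather than inside the tokenizer.
            TokenStream stream = new LowerCaseFilter(tok);
            CharTermAttribute term = stream.addAttribute(CharTermAttribute.class);

            stream.reset();
            while (stream.incrementToken()) {
                System.out.println(term.toString()); // prints: foo, bar, baz
            }
            stream.end();
            stream.close();
        }
    }

Keeping tokenization and normalization as separate stages is the usual pattern in Lucene analysis chains; swapping the comma predicate for Character::isWhitespace gives the whitespace-splitting behaviour described in the reply above.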