Hi Javier,

it seems that you are either missing the lucene-codecs jar from your
classpath or that you have the wrong version (not 4.10.4). Could you check
whether your job jar includes lucene-codecs? If so, could you run mvn
dependency:tree in the root directory of your project? It will show you
which version of lucene-codecs is included and which dependency pulls it in.
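For illustration, the two checks could look roughly like this (the jar path
is just a placeholder for your actual job jar):

```shell
# List the contents of the packaged job jar and look for the Lucene codecs
# classes/JAR entries (path below is a placeholder):
jar tf target/your-job.jar | grep -i lucene-codecs

# Print the resolved dependency tree, filtered to Lucene artifacts, to see
# which version of lucene-codecs is on the classpath and via which dependency:
mvn dependency:tree -Dincludes=org.apache.lucene
```

If a transitive dependency drags in a different Lucene version, you can pin
4.10.4 explicitly in your pom or exclude the conflicting artifact.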

Cheers,
Till

On Tue, Jan 12, 2016 at 11:55 AM, Lopez, Javier <javier.lo...@zalando.de>
wrote:

> Hi,
>
> We are using the sink for ElasticSearch and when we try to run our job we
> get the following exception:
>
> java.lang.ExceptionInInitializerError Caused by:
> java.lang.IllegalArgumentException: An SPI class of type
> org.apache.lucene.codecs.Codec with name 'Lucene410' does not exist.  You
> need to add the corresponding JAR file supporting this SPI to your
> classpath.  The current classpath supports the following names: []
>
> We are using embedded nodes and we don't know if we are missing some
> configuration for the elasticsearch client. This is the code we are using:
>
> Map<String, String> config = Maps.newHashMap();
> config.put("bulk.flush.max.actions", "1");
> config.put("cluster.name", "flink-test");
>
> result.addSink(new ElasticsearchSink<>(config,
>         new IndexRequestBuilder<Tuple4<String, Double, Long, Double>>() {
>     @Override
>     public org.elasticsearch.action.index.IndexRequest createIndexRequest(
>             Tuple4<String, Double, Long, Double> element, RuntimeContext ctx) {
>         Map<String, Object> json = new HashMap<>();
>         json.put("data", element);
>         return org.elasticsearch.client.Requests.indexRequest()
>                 .index("stream_test_1")
>                 .type("aggregation_test")
>                 .source(json);
>     }
> }));
>
> The Flink server as well as the Elasticsearch server are on the same local
> machine.
>
> Thanks for your help
>
