Re: Hadoop_Compatibility

2020-08-09 Thread C DINESH
Thanks for the response, Chesnay. I will try to understand it. If I have doubts I will get back. Thanks & Regards, Dinesh. On Thu, Aug 6, 2020 at 5:11 PM Chesnay Schepler wrote: > We still offer a flink-shaded-hadoop-2 artifact that you can find on the > download page: > https://flink.apache.or
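
A minimal sketch of what using that artifact can look like, assuming the pre-bundled uber jar from the downloads page is simply dropped into the distribution's lib/ directory (the jar version shown is only an example):

    # Copy the pre-bundled Hadoop uber jar into Flink's lib/ directory so the
    # classpath picks it up on the next (re)start of the cluster.
    cp flink-shaded-hadoop-2-uber-2.8.3-10.0.jar $FLINK_HOME/lib/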

Re: Please help, I need to bootstrap keyed state into a stream

2020-08-09 Thread Tzu-Li Tai
Hi, For the NullPointerException, what seems to be happening is that you are setting NULL values in your MapState, that is not allowed by the API. Otherwise, the code that you showed for bootstrapping state seems to be fine. > I have yet to find a working example that shows how to do both > (boo
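
A minimal sketch of a bootstrap function that avoids the NullPointerException by never putting null into MapState (State Processor API); the state name and the record shape (Tuple2 of key and counter) are illustrative assumptions, not the original poster's code:

    import org.apache.flink.api.common.state.MapState;
    import org.apache.flink.api.common.state.MapStateDescriptor;
    import org.apache.flink.api.java.tuple.Tuple2;
    import org.apache.flink.configuration.Configuration;
    import org.apache.flink.state.api.functions.KeyedStateBootstrapFunction;

    // Bootstraps per-key counters into MapState. MapState.put() rejects null
    // keys and null values, so null counters are skipped instead of inserted.
    public class CounterBootstrapper extends KeyedStateBootstrapFunction<String, Tuple2<String, Long>> {

        private transient MapState<String, Long> counts;

        @Override
        public void open(Configuration parameters) {
            counts = getRuntimeContext().getMapState(
                    new MapStateDescriptor<>("counts", String.class, Long.class));
        }

        @Override
        public void processElement(Tuple2<String, Long> record, Context ctx) throws Exception {
            if (record.f1 != null) { // guard: null values would trigger the NPE
                counts.put(record.f0, record.f1);
            }
        }
    }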

Re: Updating kafka connector with state

2020-08-09 Thread Tzu-Li (Gordon) Tai
Hi Nikola, If I remember correctly, state is not compatible between flink-connector-kafka-0.11 and the universal flink-connector-kafka. Piotr (cc'ed) would probably know what's going on here. Cheers, Gordon On Mon, Aug 10, 2020 at 1:07 PM Nikola Hrusov wrote: > Hello, > > We are trying to updat

Updating kafka connector with state

2020-08-09 Thread Nikola Hrusov
Hello, We are trying to update our kafka connector dependency. So far we have been using flink-connector-kafka-0.11 and we would like to update the dependency to flink-connector-kafka. However, when I try to restart the job with a savepoint, I get the following exception: java.lang.Exception: Exce
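
Not a confirmed fix, but a minimal sketch of the swapped-in universal consumer; broker address, topic, group id and uid are placeholders. The uid is kept explicit so the savepoint operator state can at least be matched by id; whether the 0.11 state format itself restores is the compatibility question raised above:

    import java.util.Properties;
    import org.apache.flink.api.common.serialization.SimpleStringSchema;
    import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
    import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer;

    public class UniversalKafkaSourceJob {
        public static void main(String[] args) throws Exception {
            StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

            Properties props = new Properties();
            props.setProperty("bootstrap.servers", "broker:9092");
            props.setProperty("group.id", "my-group");

            // Universal connector replacing FlinkKafkaConsumer011. Keeping the same
            // uid as the old source lets Flink map the savepoint state to this operator.
            env.addSource(new FlinkKafkaConsumer<>("my-topic", new SimpleStringSchema(), props))
               .uid("kafka-source")
               .name("kafka-source")
               .print();

            env.execute("universal-kafka-source");
        }
    }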

Re: Native K8S Jobmanager restarts and job never recovers

2020-08-09 Thread Yang Wang
Hi Kevin, I think you may not have set the high availability configuration in your native K8s session. Currently, we only support ZooKeeper HA, so you need to add the following configuration. After HA is configured, the running jobs, checkpoints and other metadata can be stored. When the jobmanager fa
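
A minimal sketch of the ZooKeeper HA settings (in flink-conf.yaml, or passed as -D dynamic options when starting the native K8s session); the quorum address, cluster id and storage path are placeholders:

    high-availability: zookeeper
    high-availability.zookeeper.quorum: zk-host-1:2181,zk-host-2:2181
    high-availability.cluster-id: my-flink-session
    high-availability.storageDir: s3://my-bucket/flink/ha

With these set, a restarted jobmanager can find the job graphs and latest checkpoint metadata in the HA storage and recover the running jobs.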

multiple kafka topics

2020-08-09 Thread Aissa Elaffani
Hello Guys, I am working on a Flink application in which I consume data from Apache Kafka. The data is published to three topics of the cluster and I need to read from all of them; I suppose I can create three FlinkKafkaConsumer instances. The data I am consuming is in the same format {Id_sensor:, Id
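
One way to avoid three separate consumers: a single FlinkKafkaConsumer can subscribe to a list of topics. A minimal sketch with placeholder broker, group and topic names:

    import java.util.Arrays;
    import java.util.Properties;
    import org.apache.flink.api.common.serialization.SimpleStringSchema;
    import org.apache.flink.streaming.api.datastream.DataStream;
    import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
    import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer;

    public class MultiTopicSensorJob {
        public static void main(String[] args) throws Exception {
            StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

            Properties props = new Properties();
            props.setProperty("bootstrap.servers", "broker:9092");
            props.setProperty("group.id", "sensors");

            // One consumer subscribed to all three sensor topics at once.
            DataStream<String> sensors = env.addSource(new FlinkKafkaConsumer<>(
                    Arrays.asList("sensors-a", "sensors-b", "sensors-c"),
                    new SimpleStringSchema(),
                    props));

            sensors.print();
            env.execute("multi-topic-sensors");
        }
    }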

[Flink-KAFKA-KEYTAB] Kafkaconsumer error Kerberos

2020-08-09 Thread Vijayendra Yadav
Hi Team, I am trying to stream data from a Kafka consumer using: https://ci.apache.org/projects/flink/flink-docs-stable/dev/connectors/kafka.html Here my Kafka cluster is Kerberos-secured and SSL-enabled. I am running my Flink streaming job in yarn-cluster mode on EMR 5.31. I have tried to pass keytab/principal in
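
A minimal sketch of a keytab-based setup, split between flink-conf.yaml and the Kafka client properties; the keytab path, principal, protocol and service name are placeholders:

    # flink-conf.yaml
    security.kerberos.login.use-ticket-cache: false
    security.kerberos.login.keytab: /home/hadoop/flink.keytab
    security.kerberos.login.principal: flink-user@EXAMPLE.COM
    security.kerberos.login.contexts: Client,KafkaClient

    # properties passed to the FlinkKafkaConsumer
    security.protocol=SASL_SSL
    sasl.kerberos.service.name=kafka

When the keytab and principal are configured this way, Flink should ship the keytab to the YARN containers itself, so it does not have to be pre-installed on every node.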