On 2021/01/07 14:36, BELGHITH Amira (EXT) wrote:
--> Our processing system is supposed to continue streaming data even when there are some Kafka errors. We expect the KafkaConsumer to fail but not the Flink job; do you think that is possible?

I'm afraid that's not possible with Flink right now. We treat all exceptions as errors, which lead to job restarts and eventually to complete job failure once the restarts exceed the configured limit.

What you could do right now is copy the code of the `FlinkKafkaConsumer` and insert exception-handling code for the exceptions that you would like to exclude. You could even go so far as to add generic handling code that you then configure with a list of exceptions to ignore when creating the consumer.
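To make the "configurable list of exceptions" idea concrete, here is a minimal, Flink-independent sketch. The class and method names (`IgnorableExceptions`, `shouldIgnore`) are hypothetical, not part of any Flink API; in a copied consumer you would call something like this from the catch block of the fetch loop and only rethrow when the check returns false.

```java
import java.util.List;

public class IgnorableExceptions {

    // Exception classes (including subclasses) that should not fail the job.
    private final List<Class<? extends Exception>> ignored;

    public IgnorableExceptions(List<Class<? extends Exception>> ignored) {
        this.ignored = ignored;
    }

    /**
     * Returns true if the thrown exception, or any exception in its
     * cause chain, matches one of the configured ignorable classes.
     */
    public boolean shouldIgnore(Throwable t) {
        for (Throwable cur = t; cur != null; cur = cur.getCause()) {
            for (Class<? extends Exception> cls : ignored) {
                if (cls.isInstance(cur)) {
                    return true;
                }
            }
        }
        return false;
    }
}
```

Walking the cause chain matters because Kafka client errors often arrive wrapped in a `RuntimeException` by the time they reach the consumer's run loop.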

I hope that helps!

Best,
Aljoscha