Hi Tony,
Thanks for your thoughts. I found the issue in my Flink processing chain. My
Kafka partition ids were already 0, 1, 2, so the problem turned out to be
something else: I had a keyBy operation before my process operation (which
contains my main stream processing logic), and only one key was ever being
assigned, so all records ended up on a single parallel subtask.
Hi Isuru,
How the FlinkKafkaConsumer assigns Kafka partitions to its parallel subtasks is
described in this javadoc:
https://ci.apache.org/projects/flink/flink-docs-release-1.3/api/java/org/apache/flink/streaming/connectors/kafka/internals/KafkaTopicPartitionAssigner.html
That means your partitions should have incremental ids (0, 1, 2, ...), so that
with a matching parallelism each partition ends up on a different parallel
subtask.
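Roughly, the assignment is a round-robin: a topic-dependent start index plus
the partition id, modulo the number of parallel subtasks. The small standalone
sketch below is not the exact Flink source, just the idea from that javadoc;
with contiguous partition ids 0, 1, 2 and parallelism 3, each partition lands
on a different subtask.

public class PartitionAssignmentSketch {

    // Simplified sketch of the assignment: a topic-dependent start index
    // plus the partition id, modulo the parallelism.
    static int assign(String topic, int partitionId, int numParallelSubtasks) {
        int startIndex = ((topic.hashCode() * 31) & 0x7FFFFFFF) % numParallelSubtasks;
        return (startIndex + partitionId) % numParallelSubtasks;
    }

    public static void main(String[] args) {
        int parallelism = 3;
        for (int partition = 0; partition < 3; partition++) {
            System.out.println("partition " + partition
                    + " -> subtask " + assign("my-topic", partition, parallelism));
        }
    }
}
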
Hi all,
I'm trying to implement a Flink consumer which consumes a Kafka topic with
3 partitions. I've set the parallelism of the execution environment to 3 as
I want to make sure that each Kafka partition is consumed by a separate
parallel task in Flink. My first question is whether it's always guaranteed
that each Kafka partition is consumed by a separate parallel task.
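For context, my setup looks roughly like the sketch below (Flink 1.3-style API
with FlinkKafkaConsumer010; the topic name, broker address, and consumer group
are placeholders, not my real configuration):

import java.util.Properties;

import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer010;
import org.apache.flink.streaming.util.serialization.SimpleStringSchema;

public class ThreePartitionConsumer {

    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
        // Match the parallelism to the number of Kafka partitions so each
        // parallel source subtask can read exactly one partition.
        env.setParallelism(3);

        Properties props = new Properties();
        props.setProperty("bootstrap.servers", "localhost:9092"); // placeholder broker
        props.setProperty("group.id", "my-flink-consumer");       // placeholder group id

        FlinkKafkaConsumer010<String> consumer = new FlinkKafkaConsumer010<>(
                "my-topic",                 // placeholder 3-partition topic
                new SimpleStringSchema(),
                props);

        env.addSource(consumer)
           .print();

        env.execute("three-partition Kafka consumer");
    }
}

With a 3-partition topic and parallelism 3, I would expect each of the three
source subtasks to read exactly one partition.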