Great to hear and thanks for letting us know.
Cheers,
Till
On Wed, Dec 19, 2018 at 5:39 PM Gerard Garcia wrote:
> We finally figured it out. We had a large value in the Kafka consumer
> option 'max.partition.fetch.bytes', which made the KafkaConsumer not
> consume at a balanced rate from all partitions.
We finally figured it out. We had a large value in the Kafka consumer option
'max.partition.fetch.bytes', which made the KafkaConsumer not consume at
a balanced rate from all partitions.
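For reference, a minimal sketch of how that property is passed to the Flink
Kafka consumer (the broker address, group id, topic name and value below are
placeholders, not our actual setup, and the consumer class name may differ
between Flink/connector versions):

import java.util.Properties;

import org.apache.flink.api.common.serialization.SimpleStringSchema;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer;

StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

Properties props = new Properties();
props.setProperty("bootstrap.servers", "localhost:9092");  // placeholder
props.setProperty("group.id", "my-consumer-group");        // placeholder
// Kafka's default is 1048576 (1 MB); setting this much higher is what
// caused the unbalanced consumption described above.
props.setProperty("max.partition.fetch.bytes", "1048576");

FlinkKafkaConsumer<String> consumer =
        new FlinkKafkaConsumer<>("my-topic", new SimpleStringSchema(), props);
DataStream<String> stream = env.addSource(consumer);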
Gerard
>>> If I understand your problem correctly, there is a similar JIRA
>>> issue FLINK-10348, reported by me. Maybe you can take a look at it.
>>>
>>> Best,
>>> Jiayi Liao
>>>
Original Message
*Sender:* Gerard Garcia
*Recipient:* fearsome.lucidity
*Cc:* user
*Date:* Monday, Oct 29, 2018 17:50
*Subject:* Re: Unbalanced Kafka consumer consumption

The stream is partitioned by key after ingestion at the finest granularity that
we can (which is finer than how the stream is partitioned when produced to Kafka).
It is not perfectly balanced but still is not so
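For context, that re-partitioning looks roughly like this (a sketch only; the
event type, field names and downstream operator are invented, not our actual
code):

// Sketch (invented names): the topic is produced keyed only by customerId,
// but after ingestion we key by the finer (customerId, deviceId) pair
// before the stateful operators.
DataStream<Event> events = env.addSource(eventConsumer);  // Kafka source, DataStream<Event>

events.keyBy(e -> e.getCustomerId() + "|" + e.getDeviceId())
      .process(new MyStatefulProcessFunction());          // hypothetical operator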
You can always shuffle the stream generated by the Kafka source
(dataStream.shuffle()) to evenly distribute records downstream.
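Something along these lines (a sketch; the source variable and the downstream
map are placeholders):

// .shuffle() redistributes records uniformly at random across the
// downstream subtasks, so skewed Kafka partitions no longer dictate
// the load of the downstream operators.
DataStream<String> fromKafka = env.addSource(kafkaConsumer);

fromKafka.shuffle()
         .map(String::toUpperCase)   // placeholder downstream work
         .print();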
On Fri, Oct 26, 2018 at 2:08 AM gerardg wrote:
> Hi,
>
> We are experiencing issues scaling our Flink application and we have observed
> that it may be because Kafka message consumption is not balanced across
> partitions.
Hi,
We are experiencing issues scaling our Flink application and we have observed
that it may be because Kafka message consumption is not balanced across
partitions. The attached image (lag per partition) shows how only one
partition consumes messages (the blue one in the back) and it wasn't until