what you think. For now I can get away with changing the
max.poll.interval.ms
AbstractCoordinator #337
int joinGroupTimeoutMs = Math.max(this.client.defaultRequestTimeoutMs(),
this.rebalanceConfig.rebalanceTimeoutMs + 5000);
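For reference, the consumer uses max.poll.interval.ms as its rebalance timeout, so raising it also raises the join-group timeout computed above. A minimal Kotlin sketch of the same arithmetic, assuming the stock consumer defaults (request.timeout.ms = 30000, max.poll.interval.ms = 300000); these values are assumptions for illustration only:

// Illustration only: mirrors the AbstractCoordinator calculation quoted above.
fun main() {
    val defaultRequestTimeoutMs = 30_000      // request.timeout.ms default (assumed)
    val rebalanceTimeoutMs = 300_000          // max.poll.interval.ms default, used as the rebalance timeout
    val joinGroupTimeoutMs = maxOf(defaultRequestTimeoutMs, rebalanceTimeoutMs + 5_000)
    println(joinGroupTimeoutMs)               // 305000 -> a larger max.poll.interval.ms pushes this up
}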
Thanks,
Tony
On Wed, Jul 14, 2021 at 10:56 PM Tony John wrote:
Can someone help me with this?
Thanks,
Tony
On Fri, Jul 9, 2021 at 8:15 PM Tony John wrote:
Hi All,
I am trying to upgrade my Kafka Streams application to Kafka 2.7.1. The
brokers have been upgraded to 2.7.1 and the Kafka dependencies are also on
2.7.1. But when I start the application, the rebalance fails with the
following message:
Rebalance failed. org.apache.kafka.common.errors.Dis
> as well.
>
> Guozhang
>
>
> On Mon, Oct 19, 2020 at 9:19 PM Tony John
> wrote:
>
> > Thanks, Matthias. I don't think I will be able to take it up. I will
> > wait for it to be available in a future release. :)
> >
> > On Mon, Oct 19, 2020 at 11:56 PM
ou _could_ do is to pick up the ticket yourself (to get
> the feature maybe into the 2.8 release). Not sure if you would be
> interested in contributing :)
>
>
> -Matthias
>
> On 10/19/20 11:08 AM, Tony John wrote:
> > Thanks for the quick response, Matthias. We were planning to move to
king
> on it.
>
>
> -Matthias
>
> On 10/19/20 9:20 AM, Tony John wrote:
Hi All,
I have been trying to find documentation on how to enable rack awareness
for a Kafka Streams app. I do see the broker.rack configuration, which
needs to be set on the broker side, and the client.rack configuration for
the consumers. Is there any specific configuration which is require
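For reference, a minimal sketch of the two settings mentioned above. The rack id, application id and bootstrap server are placeholders, and passing client.rack through consumerPrefix is an assumption about how a Streams app would forward it to its embedded consumers, not a verified recipe:

// Broker side (server.properties), placeholder rack id:
//   broker.rack=us-east-1a

// Streams side (Kotlin), forwarding client.rack to the embedded consumers:
import java.util.Properties
import org.apache.kafka.clients.consumer.ConsumerConfig
import org.apache.kafka.streams.StreamsConfig

fun rackAwareProps(): Properties {
    val props = Properties()
    props.put(StreamsConfig.APPLICATION_ID_CONFIG, "my-streams-app")    // placeholder
    props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "broker-1:9092")  // placeholder
    props.put(StreamsConfig.consumerPrefix(ConsumerConfig.CLIENT_RACK_CONFIG), "us-east-1a")
    return props
}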
Hi All,
I was trying to switch to the latest version of Kafka Streams (1.1.0) and
started seeing a significant drop in the application's performance. I was
using 0.11.0.2 before. After doing some checks I found that the choke point
was the RocksDB flush, which contributes almost 80% of the CPU time (PFA t
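On the RocksDB flush overhead mentioned above: Streams exposes a rocksdb.config.setter hook, so one thing to experiment with is the memtable settings. This is a hedged sketch only; the class name and buffer values are made up for illustration, not recommendations:

import org.apache.kafka.streams.StreamsConfig
import org.apache.kafka.streams.state.RocksDBConfigSetter
import org.rocksdb.Options

// Hypothetical tuning class; values are illustrative assumptions only.
class CustomRocksDBConfig : RocksDBConfigSetter {
    override fun setConfig(storeName: String, options: Options, configs: MutableMap<String, Any>) {
        options.setWriteBufferSize(32 * 1024 * 1024L)  // larger memtable -> fewer, bigger flushes
        options.setMaxWriteBufferNumber(3)             // allow more memtables before writes stall
    }
}

// Wired in via the Streams config, e.g.:
// props.put(StreamsConfig.ROCKSDB_CONFIG_SETTER_CLASS_CONFIG, CustomRocksDBConfig::class.java)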
ary configurations to consider changing would be increasing "
> max.block.ms" and "retries"
>
> Thanks,
> Bill
>
> On Thu, Feb 22, 2018 at 8:14 AM, Tony John
> wrote:
>
Hi All,
I am running into an issue with my Kafka Streams application. The
application was running fine for almost 2 weeks; then it started throwing
the exception below, which caused the threads to die. Now when I restart
the application, it dies quickly (1-2 hrs) while trying to catch up on the lag.
T
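Relating this back to Bill's note above about max.block.ms and retries: in a Streams app those are producer settings, so a minimal sketch of overriding them would look roughly like the following (values are placeholders, not recommendations):

import java.util.Properties
import org.apache.kafka.clients.producer.ProducerConfig
import org.apache.kafka.streams.StreamsConfig

fun producerOverrides(): Properties {
    val props = Properties()                  // the app's existing Streams config would go here too
    props.put(StreamsConfig.producerPrefix(ProducerConfig.MAX_BLOCK_MS_CONFIG), 120000)  // placeholder
    props.put(StreamsConfig.producerPrefix(ProducerConfig.RETRIES_CONFIG), 10)           // placeholder
    return props
}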
search for the following line:
>
> "Initiating connection to node.."
>
> and then find out the host:port of the node with id -1.
>
> Guozhang
>
> On Tue, Feb 6, 2018 at 8:16 AM, Tony John
> wrote:
>
> > Hi Guozhang,
> >
> > Thanks for looking int
80
records. Adjusting up recordsProcessedBeforeCommit=76501
Thanks,
Tony
On Tue, Feb 6, 2018 at 3:21 AM, Guozhang Wang wrote:
> Hello Tony,
>
>
> Could you share your Streams config values so that people can help further
> investigate your issue?
>
>
> Guozhang
>
>
> On Mon,
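As an aside on the "Adjusting up recordsProcessedBeforeCommit" line further up: as far as I understand, that value is derived from commit.interval.ms and the observed processing latency, so the commit interval is the config actually involved. A sketch with an arbitrary example value:

import java.util.Properties
import org.apache.kafka.streams.StreamsConfig

fun commitIntervalOverride(): Properties {
    val props = Properties()
    props.put(StreamsConfig.COMMIT_INTERVAL_MS_CONFIG, 30000)  // 30 s, illustrative only
    return props
}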
Hi All,
I have been running a Streams application for some time. The application
runs fine for a while, but after a day or two I see the log below getting
printed continuously to the console.
WARN 2018-02-05 02:50:04.060 [kafka-producer-network-thread | producer-1]
org.apache.kafka.clients.Net
of memory issues seems not
> to be related to the CommitFailed error. Do you have any stateful operations
> in your app that use an iterator? Did you close the iterator after you
> finished using it?
>
>
> Guozhang
>
>
> On Tue, Nov 7, 2017 at 12:42 AM, Tony John
> wrote
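A minimal Kotlin sketch of the iterator hygiene Guozhang is asking about above; the store name and types are hypothetical. The point is that a state-store iterator holds RocksDB resources until it is closed, and Kotlin's use {} closes it even if the loop throws:

import org.apache.kafka.streams.processor.ProcessorContext
import org.apache.kafka.streams.state.KeyValueStore

// Hypothetical lookup inside a Processor; "my-store" is a made-up store name.
fun sumValues(context: ProcessorContext): Long {
    @Suppress("UNCHECKED_CAST")
    val store = context.getStateStore("my-store") as KeyValueStore<String, Long>
    var total = 0L
    store.all().use { iter ->                 // KeyValueIterator is Closeable, so use {} closes it
        while (iter.hasNext()) {
            total += iter.next().value
        }
    }
    return total
}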
LUE)
props.put(StreamsConfig.consumerPrefix(ConsumerConfig.SESSION_TIMEOUT_MS_CONFIG),
3)
streams = KafkaStreams(builder, StreamsConfig(props))
streams.start()
Thanks,
Tony
On Thu, Nov 2, 2017 at 4:39 PM, Tony John wrote:
Hi All,
I am facing a CommitFailedException in my Streams application. As per the
log I tried changing max.poll.interval.ms and max.poll.records, but neither
helped. PFA the full stack trace of the exception; below is the Streams
configuration used. What else could be wrong?
val props = Pr
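Since the configuration block above is cut off, for reference this is roughly how those two consumer settings are overridden from a Streams app; the values here are placeholders, not the ones from the original mail:

import java.util.Properties
import org.apache.kafka.clients.consumer.ConsumerConfig
import org.apache.kafka.streams.StreamsConfig

fun consumerOverrides(): Properties {
    val props = Properties()
    props.put(StreamsConfig.consumerPrefix(ConsumerConfig.MAX_POLL_INTERVAL_MS_CONFIG), 600000)  // placeholder
    props.put(StreamsConfig.consumerPrefix(ConsumerConfig.MAX_POLL_RECORDS_CONFIG), 100)         // placeholder
    return props
}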
e when I checked earlier today. Anyway, once again
thanks a lot for the response. I will raise a JIRA as you suggested, and I
hope this isn't the case with local state stores.
Thanks,
Tony
On Wed, Oct 18, 2017 at 9:21 PM, Tony John wrote:
Hello All,
I have been trying to create an application on top of Kafka Streams. I am a
newbie to Kafka and Kafka Streams, so please excuse me if my understanding is
wrong.
I got the application running fine on a single EC2 instance in AWS. Now I am
looking at scaling and ran into some issue