How do I reset a broker after it is automatically added to the cluster?
Hello,
It will be helpful if anyone suggests solutions.
Thanks and regards,
Gowtham S, MCA
On Tue, 7 Apr 2020 at 12:31, Gowtham S wrote:
>
> Hello,
>
> As per our team requirements, we are supposed to use and communicate
> between two different Kafka Clients, provided and managed by two separ
Hello Gowtham, could you elaborate on your use case?
What are you trying to solve?
On Wed, Apr 15, 2020 at 2:47 PM Gowtham S wrote:
> Hello,
>
> It will be helpful if anyone suggests solutions.
>
> Thanks and regards,
> Gowtham S, MCA
>
>
> On Tue, 7 Apr 2020 at 12:31, G
Boom, you got it, Liam! Nice debugging work.
This is a pretty big bummer, but I had to do it that way for compatibility. I
added a log message to try and help reduce the risk, but it’s still kind of a
trap.
I’d like to do a KIP at some point to consider changing the default grace
period, but
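In the meantime, the way to sidestep the trap is to set the grace period
explicitly rather than relying on the 24-hour default. A minimal sketch,
assuming a simple windowed count and made-up topic names:

    import java.time.Duration;
    import org.apache.kafka.streams.StreamsBuilder;
    import org.apache.kafka.streams.Topology;
    import org.apache.kafka.streams.kstream.TimeWindows;

    public class ExplicitGraceExample {
        static Topology buildTopology() {
            StreamsBuilder builder = new StreamsBuilder();
            builder.<String, String>stream("events")            // made-up input topic
                   .groupByKey()
                   // Without grace(), the default grace period is 24 hours, which is the
                   // "trap" above: windows keep accepting late records for a full day.
                   .windowedBy(TimeWindows.of(Duration.ofMinutes(5))
                                          .grace(Duration.ofMinutes(1)))
                   .count();
            return builder.build();
        }
    }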
What do you mean by communication between Kafka Clients?
A Client is either a producer or a consumer; they don't "communicate"...?
On Tue, 7 Apr 2020 at 09:01, Gowtham S wrote:
> Hello,
>
> As per our team requirements, we are supposed to use and communicate
> between two different Kafka Cl
Hi,
We have a Kafka Streams application that runs multiple instances and
consumes from a source topic.
Producers produce keyed messages to this source topic.
Keyed messages are events from different sources and each source has a
unique key.
So what essentially happens is that messages from parti
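(For context, the producing side of such a setup looks roughly like the sketch
below; the topic, key, and broker names are invented. Because each source always
uses its own key, the default partitioner keeps all of that source's events in
one partition.)

    import java.util.Properties;
    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.ProducerRecord;
    import org.apache.kafka.common.serialization.StringSerializer;

    public class KeyedEventsProducer {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put("bootstrap.servers", "localhost:9092");          // invented address
            props.put("key.serializer", StringSerializer.class.getName());
            props.put("value.serializer", StringSerializer.class.getName());

            try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
                // Same key => same partition (with the default partitioner), so all
                // events of "source-42" stay together and stay ordered.
                producer.send(new ProducerRecord<>("source-topic", "source-42", "event-1"));
                producer.send(new ProducerRecord<>("source-topic", "source-42", "event-2"));
                producer.send(new ProducerRecord<>("source-topic", "source-7", "event-1"));
            }
        }
    }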
Background: I want to use Flink to consume the log stream from Kafka once and,
according to different requirements, add or remove some params in each line,
then distribute the result to different Kafka clusters or different topics.
Question: the attachment is the simple example I used to test distributing to
diff
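(The attachment isn't reproduced here; for reference, a fan-out like that
usually looks roughly like the sketch below, using Flink's Kafka connector.
All cluster addresses, topic names, and the routing rule are placeholders.)

    import java.util.Properties;
    import org.apache.flink.api.common.serialization.SimpleStringSchema;
    import org.apache.flink.streaming.api.datastream.DataStream;
    import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
    import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer;
    import org.apache.flink.streaming.connectors.kafka.FlinkKafkaProducer;

    public class LogFanOutJob {
        public static void main(String[] args) throws Exception {
            StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

            Properties sourceProps = new Properties();
            sourceProps.setProperty("bootstrap.servers", "source-cluster:9092"); // placeholder
            sourceProps.setProperty("group.id", "log-fanout");

            // Read the log stream from the source cluster once.
            DataStream<String> logs = env.addSource(
                    new FlinkKafkaConsumer<>("log-topic", new SimpleStringSchema(), sourceProps));

            // Add/remove params per line here with map() as needed; omitted for brevity.

            Properties clusterA = new Properties();
            clusterA.setProperty("bootstrap.servers", "cluster-a:9092");         // placeholder
            logs.filter(line -> line.contains("serviceA"))                       // placeholder rule
                .addSink(new FlinkKafkaProducer<>("topic-a", new SimpleStringSchema(), clusterA));

            Properties clusterB = new Properties();
            clusterB.setProperty("bootstrap.servers", "cluster-b:9092");         // placeholder
            logs.filter(line -> !line.contains("serviceA"))
                .addSink(new FlinkKafkaProducer<>("topic-b", new SimpleStringSchema(), clusterB));

            env.execute("kafka-log-fan-out");
        }
    }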
Hey Sachin,
your observation is correct; unfortunately Kafka Streams doesn't support
adding partitions online. The rebalance cannot guarantee that the same key
routes to the same partition when the input topic's partition count changes, as
it is the upstream producer's responsibility to consistently rout
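To make the routing point concrete: for keyed records the default partitioner
is essentially murmur2(keyBytes) mod numPartitions, so an existing key's target
partition changes as soon as the partition count does. A small illustration
using Kafka's own hash utility (the key is invented):

    import java.nio.charset.StandardCharsets;
    import org.apache.kafka.common.utils.Utils;

    public class KeyRoutingDemo {
        public static void main(String[] args) {
            byte[] key = "source-42".getBytes(StandardCharsets.UTF_8);  // invented key
            int hash = Utils.toPositive(Utils.murmur2(key));

            // Same key, different partition counts => (usually) different partitions,
            // which is why expanding a topic silently breaks per-key locality/ordering.
            System.out.println("With 6 partitions:  partition " + (hash % 6));
            System.out.println("With 12 partitions: partition " + (hash % 12));
        }
    }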
Hi Sachin,
Just to build on Boyang’s answer a little, when designing Kafka’s partition
expansion operation, we did consider making it also work for dynamic
repartitioning in a way that would work for Streams as well, but it added too
much complexity, and the contributor had some other use c
Hi,
I will look into the suggestions you folks mentioned.
I was just wondering something purely from the Kafka point of view.
Let's say we add new partitions to Kafka topics. Is there any way to
configure it so that only new keys get their messages added to those partitions,
while existing keys continue to add their
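Out of the box the default partitioner always hashes over the current partition
count, so pinning existing keys would take a custom producer Partitioner along
these lines. A rough sketch only: the config key and the "is this a new key"
rule are invented for illustration, and null keys aren't handled.

    import java.util.Map;
    import org.apache.kafka.clients.producer.Partitioner;
    import org.apache.kafka.common.Cluster;
    import org.apache.kafka.common.utils.Utils;

    public class LegacyCountPartitioner implements Partitioner {
        private int legacyPartitionCount;

        @Override
        public void configure(Map<String, ?> configs) {
            // Invented config: the partition count the topic had before expansion.
            legacyPartitionCount =
                    Integer.parseInt(String.valueOf(configs.get("legacy.partition.count")));
        }

        @Override
        public int partition(String topic, Object key, byte[] keyBytes,
                             Object value, byte[] valueBytes, Cluster cluster) {
            int currentCount = cluster.partitionsForTopic(topic).size();
            int hash = Utils.toPositive(Utils.murmur2(keyBytes));
            // Invented rule: keys prefixed "new-" may use the expanded range;
            // everything else keeps hashing over the old count so it doesn't move.
            boolean isNewKey = key instanceof String && ((String) key).startsWith("new-");
            return isNewKey ? hash % currentCount : hash % legacyPartitionCount;
        }

        @Override
        public void close() { }
    }

It would be plugged in via the producer's partitioner.class config; note this
only covers your own producers, not the internal repartition topics Streams
creates.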
I see 2.5.0 is in maven central (since yesterday).
Can I assume it is officially released?
Thanks.
On Tue, Apr 14, 2020 at 11:15 AM David Arthur wrote:
> Thanks everyone! The vote passes with 7 +1 votes (4 of which are binding)
> and no 0 or -1 votes.
>
> 4 binding +1 votes from PMC members Ma
The Apache Kafka community is pleased to announce the release for Apache
Kafka 2.5.0
This release includes many new features, including:
* TLS 1.3 support (1.2 is now the default)
* Co-groups for Kafka Streams
* Incremental rebalance for Kafka Consumer
* New metrics for better operational insight
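For anyone curious, the new co-groups DSL looks roughly like this (a sketch
with invented topic names and a trivial count-style aggregate):

    import org.apache.kafka.streams.StreamsBuilder;
    import org.apache.kafka.streams.Topology;
    import org.apache.kafka.streams.kstream.Aggregator;
    import org.apache.kafka.streams.kstream.KGroupedStream;
    import org.apache.kafka.streams.kstream.KTable;

    public class CogroupExample {
        static Topology build() {
            StreamsBuilder builder = new StreamsBuilder();

            KGroupedStream<String, String> clicks =
                    builder.<String, String>stream("clicks").groupByKey();   // invented topic
            KGroupedStream<String, String> views =
                    builder.<String, String>stream("views").groupByKey();    // invented topic

            Aggregator<String, String, Long> countClicks = (key, value, total) -> total + 1;
            Aggregator<String, String, Long> countViews = (key, value, total) -> total + 1;

            // New in 2.5 (KIP-150): aggregate several grouped streams into one KTable
            // without chaining multiple joins.
            KTable<String, Long> totals = clicks
                    .cogroup(countClicks)
                    .cogroup(views, countViews)
                    .aggregate(() -> 0L);

            return builder.build();
        }
    }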
Thanks for running the release David!
-Matthias
On 4/15/20 1:15 PM, David Arthur wrote:
> The Apache Kafka community is pleased to announce the release for Apache
> Kafka 2.5.0
>
> This release includes many new features, including:
>
> * TLS 1.3 support (1.2 is now the default)
> * Co-groups f
Gary, indeed the release is official now. There are many moving parts to
the release which happen sequentially. Artifacts are generally available
between a few hours to a day before the announcement goes out.
Thanks,
David
On Wed, Apr 15, 2020 at 1:34 PM Gary Russell wrote:
> I see 2.5.0 is in
Hello,
We're trying to build an "index" of our data sources, mostly Kafka topics,
so that various teams and services can identify potential existing sources
of data for specific needs.
We're trying to label topics with the producing teams and generic entity names
(currently topic names don't alway
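If it helps as a starting point, the AdminClient can enumerate topics and their
configs, which is one way to bootstrap such an index before layering labels on
top; a rough sketch with a placeholder bootstrap address:

    import java.util.Collections;
    import java.util.Properties;
    import java.util.Set;
    import org.apache.kafka.clients.admin.AdminClient;
    import org.apache.kafka.clients.admin.Config;
    import org.apache.kafka.common.config.ConfigResource;

    public class TopicInventory {
        public static void main(String[] args) throws Exception {
            Properties props = new Properties();
            props.put("bootstrap.servers", "localhost:9092");   // placeholder

            try (AdminClient admin = AdminClient.create(props)) {
                // List every non-internal topic in the cluster...
                Set<String> topics = admin.listTopics().names().get();

                // ...and dump each topic's configs as raw material for the index.
                for (String topic : topics) {
                    ConfigResource resource = new ConfigResource(ConfigResource.Type.TOPIC, topic);
                    Config config = admin.describeConfigs(Collections.singleton(resource))
                                         .all().get().get(resource);
                    System.out.println(topic + " -> " + config.entries().size() + " config entries");
                }
            }
        }
    }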
Thanks for doing a fantastic job with this release, David.
On Tue, Apr 14, 2020 at 11:15 AM David Arthur wrote:
> Thanks everyone! The vote passes with 7 +1 votes (4 of which are binding)
> and no 0 or -1 votes.
>
> 4 binding +1 votes from PMC members Manikumar, Jun, Colin, and Matthias
> 1 comm
Hi, Everyone,
Here is an update on the upcoming Kafka Summit events in 2020.
1. Unfortunately, Kafka Summit London, originally planned on Apr 27/28, has
been cancelled due to COVID-19.
2. Kafka Summit Austin (Aug 24/25) is still on. The CFP (
https://events.kafka-summit.org/kafka-summit-austin-2
How do I restart a broker after it is automatically added to the cluster?
Is this not possible in the current version of Kafka?
If not, how do I add it to the cluster manually?
Which sh command should I use?
Thank you!