Thanks a lot, Peter. I searched for this line, "It’s possible to enable
rack-awareness in a rolling manner", everywhere on the internet but
couldn't find it. Thank you for your help. Have a real nice day.
On Thu, Sep 19, 2019 at 10:43 PM Peter Bukowinski wrote:
> Hi Ashu,
>
> It’s possible to ena
Hi,
I'm running a Kafka Streams app (v2.1.0) with a windowed function. But
after 24 hours of running, local disk usage has increased from 5G to 20G and
keeps increasing. From what I googled, once I introduce `windowedBy`,
old data should be removed automatically.
My topology looks like this:
stream.selec
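The rest of the topology got cut off above, but for reference, a minimal sketch of a windowed count topology on Streams 2.1 with an explicit retention on the window store might look like the following (the topic names, key selector, window size, grace period, and retention here are all hypothetical). By default a window store keeps data for one day, so the state directory only starts shrinking once that retention has elapsed; setting a shorter retention via Materialized.withRetention is one way to cap it:

    import java.time.Duration;
    import java.util.Properties;

    import org.apache.kafka.common.serialization.Serdes;
    import org.apache.kafka.common.utils.Bytes;
    import org.apache.kafka.streams.KafkaStreams;
    import org.apache.kafka.streams.StreamsBuilder;
    import org.apache.kafka.streams.StreamsConfig;
    import org.apache.kafka.streams.kstream.KStream;
    import org.apache.kafka.streams.kstream.Materialized;
    import org.apache.kafka.streams.kstream.Produced;
    import org.apache.kafka.streams.kstream.TimeWindows;
    import org.apache.kafka.streams.state.WindowStore;

    public class WindowedCountApp {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put(StreamsConfig.APPLICATION_ID_CONFIG, "windowed-count-app");  // hypothetical
            props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");   // hypothetical
            props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
            props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

            StreamsBuilder builder = new StreamsBuilder();
            KStream<String, String> stream = builder.stream("input-topic");        // hypothetical topic

            stream
                .selectKey((key, value) -> value)                                  // hypothetical re-keying
                .groupByKey()
                // 5-minute tumbling windows with an explicit grace period; without one,
                // the old 1-day default keeps windows around much longer
                .windowedBy(TimeWindows.of(Duration.ofMinutes(5))
                    .grace(Duration.ofMinutes(10)))
                .count(Materialized.<String, Long, WindowStore<Bytes, byte[]>>as("windowed-counts")
                    // default retention is 1 day; windowed state (and disk usage) is only
                    // purged once this retention has elapsed
                    .withRetention(Duration.ofHours(1)))
                .toStream((windowedKey, count) -> windowedKey.key() + "@" + windowedKey.window().start())
                .to("output-topic", Produced.with(Serdes.String(), Serdes.Long())); // hypothetical topic

            new KafkaStreams(builder.build(), props).start();
        }
    }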
Hello, Kafka Users,
Our brokers and clients are on 0.9.0.1, and we would like to upgrade our producer
and consumer clients to 2.2.1 first.
Does it matter if we upgrade the client jar from 0.9.0.1 to 2.2.1 while the
brokers are still on 0.9.0.1?
https://kafka.apache.org/23/documentation/streams/upgrade-guid
The resolution to this issue was to restart the broker that was the active
controller, so that the controller role moved to a new broker. It is not clear
why the existing active controller was no longer getting reassignment updates
from the ZooKeeper quorum.
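For anyone hitting the same issue, one quick way to confirm which broker currently holds the controller role (and therefore which one to bounce) is the Java AdminClient; a minimal sketch, assuming a reachable bootstrap address:

    import java.util.Properties;

    import org.apache.kafka.clients.admin.AdminClient;
    import org.apache.kafka.clients.admin.AdminClientConfig;
    import org.apache.kafka.common.Node;

    public class FindController {
        public static void main(String[] args) throws Exception {
            Properties props = new Properties();
            // hypothetical bootstrap address -- any live broker will do
            props.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
            try (AdminClient admin = AdminClient.create(props)) {
                // describeCluster() reports the node currently acting as controller
                Node controller = admin.describeCluster().controller().get();
                System.out.println("Active controller: broker " + controller.id()
                        + " (" + controller.host() + ":" + controller.port() + ")");
            }
        }
    }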
On Wed, Jul 10, 2019 at 5:43 PM Devin Kramer
wrote:
> W
Hi Ashu,
It’s possible to enable rack-awareness in a rolling manner. Kafka will never
automatically move existing partitions, unless you tell it to or have a
separate tool (e.g. Cruise Control) that does it for you. Rack-awareness comes
into play when topics are initially created and partitions
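For reference, rack awareness is enabled per broker through the `broker.rack` property in server.properties and can be applied with a rolling restart; a minimal sketch, with hypothetical broker ids and the AZ name used as the rack:

    # server.properties on a broker in the first AZ
    broker.id=1
    broker.rack=us-east-1a

    # server.properties on a broker in the second AZ
    broker.id=5
    broker.rack=us-east-1b

Once every broker has a rack assigned, replicas of newly created topics (and newly added partitions) are spread across the racks; partitions that already exist stay where they are until you reassign them.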
Hello,
Does it matter if we upgrade the client jar from 2.1.0 to 2.3.0 while the
brokers are using 2.2.0?
https://kafka.apache.org/23/documentation/streams/upgrade-guide
We don't use streams, just plain producer/consumer clients. For us, would
it matter at all?
Thanks,
Greetings,
We have an 8-node broker setup on AWS in 2 availability zones with a
replication factor of 2. Our plan is to add one more node, distribute 3 nodes in
each AZ, and change the replication factor to 3.
In order to replicate data in each AZ, we need to enable rack awareness. Can
someone guide me on how I can ac
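In case it helps with the follow-on step, once rack awareness is in place the existing partitions can be moved to a replication factor of 3 with the kafka-reassign-partitions.sh tool; a minimal sketch, where the topic name, partition, and broker ids are hypothetical and each listed replica sits in a different AZ:

    # increase-rf.json -- list three replica brokers (one per AZ) for each partition
    {
      "version": 1,
      "partitions": [
        { "topic": "my-topic", "partition": 0, "replicas": [1, 4, 7] }
      ]
    }

    # apply, then verify, the reassignment
    bin/kafka-reassign-partitions.sh --zookeeper zk1:2181 \
        --reassignment-json-file increase-rf.json --execute
    bin/kafka-reassign-partitions.sh --zookeeper zk1:2181 \
        --reassignment-json-file increase-rf.json --verify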