Hi Ahmed,
If you have the logic to identify the destination cluster along with the
target topic, you will be able to achieve this with the above solution.
1. Create one Kafka producer for each cluster. If there are 10 clusters,
create 10 producers.
2. Add a new attribute called 'clusterId' or similar
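The two steps above can be sketched in plain Java. This is only an illustration of the routing idea, not Flink or Kafka API: `FakeProducer` stands in for a real `KafkaProducer` configured with one cluster's `bootstrap.servers`, and the `clusterId` keys and topic names are made up.

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Sketch: one producer per cluster, selected by the event's clusterId.
public class ClusterRouter {
    // One producer per cluster, created up front (10 clusters -> 10 producers).
    private final Map<String, FakeProducer> producers = new HashMap<>();

    public void register(String clusterId, FakeProducer p) {
        producers.put(clusterId, p);
    }

    // Look up the producer for the event's clusterId and send to the target topic.
    public void route(String clusterId, String topic, String value) {
        FakeProducer p = producers.get(clusterId);
        if (p == null) {
            throw new IllegalArgumentException("Unknown cluster: " + clusterId);
        }
        p.send(topic, value);
    }

    // Stand-in for KafkaProducer; records sends so the routing is observable.
    public static class FakeProducer {
        public final List<String> sent = new ArrayList<>();

        public void send(String topic, String value) {
            sent.add(topic + ":" + value);
        }
    }
}
```

In a real job each `FakeProducer` would be replaced by a producer (or sink) built from that cluster's connection properties, and the `clusterId` attribute from step 2 would drive the lookup.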
Thank you, Ejaskhan.
I think your suggestion would only work if all the topics were on the same
Kafka cluster. In my use-case, the topics can be on different clusters, which
is why I was thinking of rolling a custom sink that detects config changes and
instantiates Kafka producers on demand as
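A minimal sketch of that on-demand instantiation could be a lazy cache keyed by each cluster's bootstrap servers, creating a producer the first time a record is routed there. Everything here is an assumption for illustration: `LazyProducerCache` and its factory are hypothetical names, not an existing Flink or Kafka API.

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.function.Function;

// Sketch: producers are created lazily, once per cluster, as config introduces
// new destinations. P would be a real KafkaProducer in practice.
public class LazyProducerCache<P> {
    private final Map<String, P> cache = new ConcurrentHashMap<>();
    // e.g. bootstrapServers -> new KafkaProducer<>(propsFor(bootstrapServers))
    private final Function<String, P> factory;

    public LazyProducerCache(Function<String, P> factory) {
        this.factory = factory;
    }

    // Returns the existing producer for this cluster, or creates one on first use.
    public P forCluster(String bootstrapServers) {
        return cache.computeIfAbsent(bootstrapServers, factory);
    }

    public int size() {
        return cache.size();
    }
}
```

A config-change listener would only need to add or remove entries in this cache; records arriving for a new cluster trigger producer creation without a redeploy.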
Hi Ahmed,
If you want to dynamically produce events to different topics and you have
the logic to identify the target topics, you will be able to achieve this
in the following way.
- Suppose this is your event after the transformation logic (if any):
EVENT.
- This is the target topic f
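The per-event topic selection described above can be sketched as follows. The event shape, the `type` field, and the topic names are all assumptions for illustration; in Flink this logic would live wherever the target topic is derived per record.

```java
import java.util.Map;

// Sketch: derive the target topic from the event itself.
public class TopicSelector {
    // Hypothetical routing table from an event's type to its target topic.
    private static final Map<String, String> ROUTES = Map.of(
            "order", "orders-topic",
            "payment", "payments-topic");

    // Falls back to a catch-all topic when no route matches (an assumption,
    // chosen here so unroutable events are not silently dropped).
    public static String topicFor(String eventType) {
        return ROUTES.getOrDefault(eventType, "unrouted-events");
    }
}
```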
Hello everyone,
I have a use-case where I need to have a Flink application produce to a
variable number of Kafka topics (specified through configuration), potentially
in different clusters, without having to redeploy the app. Let's assume I
maintain the set of destination clusters/topics in conf