Hi Prasanna,
auto.create.topics.enable is only recommended for development clusters and
not for production use cases (one programming error could potentially flood
the whole broker with a large number of topics). I have experienced
first-hand the mess it makes.
I think "auto.create.topics.enable" is enabled by default [1]?
Best,
Jark
[1]: https://kafka.apache.org/documentation/#auto.create.topics.enable
On Mon, 1 Jun 2020 at 19:55, Leonard Xu wrote:
I think @brat is right. I didn't know about the Kafka property
'auto.create.topics.enable'; you can pass the property to the Kafka
producer, and that should work.
Best,
Leonard Xu
> On 1 Jun 2020, at 18:33, satya brat wrote:
Prasanna,
You might want to check the Kafka broker configs, where
'auto.create.topics.enable' helps with creating a new topic whenever a
message is published to a non-existent topic.
https://kafka.apache.org/documentation/#brokerconfigs
I am not too sure about the pitfalls, if any.
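For reference, it is a broker-side setting (it defaults to true, per the
docs); a minimal sketch of the relevant line in the broker's
server.properties:

    # server.properties on each broker; 'true' is the documented default
    auto.create.topics.enable=true

The partition count and replication factor of auto-created topics come from
the broker's num.partitions and default.replication.factor settings.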
Hi, kumar
Sorry, I missed the original question. I think we cannot create topics
dynamically at the moment; creating a topic should belong to the control
flow rather than the data flow, and the user may have some custom
configurations for the topic, from my understanding. Maybe you need to
implement the logic of checking/creating the topic yourself, as in the
sketch below.
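Something like this (a rough sketch, not tested; the class name, topic name,
partition count and replication factor are placeholders) using the Kafka
AdminClient:

    import java.util.Collections;
    import java.util.Properties;
    import org.apache.kafka.clients.admin.AdminClient;
    import org.apache.kafka.clients.admin.NewTopic;

    public class EnsureTopic {
        // Create the topic if it does not exist yet, before the job writes.
        public static void ensureTopicExists(String bootstrapServers,
                                             String topic) throws Exception {
            Properties props = new Properties();
            props.put("bootstrap.servers", bootstrapServers);
            try (AdminClient admin = AdminClient.create(props)) {
                if (!admin.listTopics().names().get().contains(topic)) {
                    // 3 partitions, replication factor 1 are placeholders
                    NewTopic newTopic = new NewTopic(topic, 3, (short) 1);
                    admin.createTopics(Collections.singleton(newTopic))
                         .all().get();
                }
            }
        }
    }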
Leonard,
Thanks for the reply; I will look into those options.
But as for the original question, could we create a topic dynamically when
required?
Prasanna.
On Mon, Jun 1, 2020 at 2:18 PM Leonard Xu wrote:
Hi, kumar
Flink supports consuming from / producing to multiple Kafka topics [1]. In
your case you can implement KeyedSerializationSchema (a legacy interface) or
KafkaSerializationSchema [2] to make one producer instance send data to
multiple topics; see the sketch below. There is an ITCase you can
reference [3].
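A minimal sketch of the idea (the Event POJO, the topic naming scheme and
the JSON payload below are just assumptions, not from the docs):

    import java.nio.charset.StandardCharsets;
    import org.apache.flink.streaming.connectors.kafka.KafkaSerializationSchema;
    import org.apache.kafka.clients.producer.ProducerRecord;

    // Hypothetical event type carrying the routing key and payload.
    class Event {
        String eventType;
        String json;
    }

    public class RoutingSerializationSchema
            implements KafkaSerializationSchema<Event> {
        @Override
        public ProducerRecord<byte[], byte[]> serialize(Event element,
                                                        Long timestamp) {
            // Pick the target topic per record, e.g. "events-ORDER".
            String topic = "events-" + element.eventType;
            return new ProducerRecord<>(
                    topic, element.json.getBytes(StandardCharsets.UTF_8));
        }
    }

One producer instance can then serve all topics, e.g.
new FlinkKafkaProducer<>("events-default", new RoutingSerializationSchema(),
kafkaProps, FlinkKafkaProducer.Semantic.AT_LEAST_ONCE).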
Best,
Leonard Xu
Hi,
I have a use case where I read events from a single Kafka stream comprising
JSON messages.
The requirement is to split the stream into multiple output streams based on
some criteria, say the type of event, or the type and the customer
associated with the event.
We could achieve the splitting