There is a setting for dynamic topic and partition discovery:
https://nightlies.apache.org/flink/flink-docs-release-1.13/docs/connectors/table/kafka/#topic-and-partition-discovery
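
For illustration, here is a rough, untested sketch with the 1.13 Table API; the topic pattern, broker address, group id and format below are only placeholders, and the relevant option is 'scan.topic-partition-discovery.interval':

import org.apache.flink.table.api.EnvironmentSettings;
import org.apache.flink.table.api.TableEnvironment;

public class DynamicTopicDiscoveryExample {
    public static void main(String[] args) {
        TableEnvironment tEnv =
                TableEnvironment.create(
                        EnvironmentSettings.newInstance().inStreamingMode().build());

        // Subscribe by pattern and re-check for new matching topics/partitions
        // every 30 seconds.
        tEnv.executeSql(
                "CREATE TABLE events (\n"
                        + "  payload STRING\n"
                        + ") WITH (\n"
                        + "  'connector' = 'kafka',\n"
                        + "  'topic-pattern' = 'events-.*',\n"
                        + "  'properties.bootstrap.servers' = 'localhost:9092',\n"
                        + "  'properties.group.id' = 'discovery-demo',\n"
                        + "  'scan.startup.mode' = 'latest-offset',\n"
                        + "  'scan.topic-partition-discovery.interval' = '30s',\n"
                        + "  'format' = 'json'\n"
                        + ")");

        // Queries against 'events' should then also read from topics that are
        // created after the job has started, as long as they match the pattern.
        tEnv.executeSql("SELECT * FROM events").print();
    }
}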

Best,

Denis

On Fri, Oct 15, 2021 at 12:48 AM Denis Nutiu <denis.nu...@gmail.com> wrote:

> Hi,
>
> In my experience with the librdkafka client and the Go wrapper, subscribing
> by topic pattern is reactive (new topics matching the pattern are picked up
> as they appear). The Flink Kafka connector might behave similarly.
>
> Best,
> Denis
>
> On Fri, Oct 15, 2021 at 12:34 AM Preston Price <nacro...@gmail.com> wrote:
>
>> No, the topic-pattern won't work for my case. Topics that I should
>> subscribe to can be enabled/disabled based on settings I read from another
>> system, so there's no way to craft a single regular expression that would
>> fit the state of all potential topics. Additionally, the documentation you
>> linked seems to suggest that the regular expression is evaluated only once,
>> "when the job starts running", so my understanding is that it would not pick
>> up new topics that match the pattern after the job starts.
>>
>>
>> On Wed, Oct 13, 2021 at 8:51 PM Caizhi Weng <tsreape...@gmail.com> wrote:
>>
>>> Hi!
>>>
>>> I suppose you want to read from different topics every now and then?
>>> Does the topic-pattern option [1] in the Table API Kafka connector meet
>>> your needs?
>>>
>>> [1]
>>> https://ci.apache.org/projects/flink/flink-docs-master/docs/connectors/table/kafka/#topic-pattern
>>>
>>> On Thu, Oct 14, 2021 at 1:34 AM Preston Price <nacro...@gmail.com> wrote:
>>>
>>>> The KafkaSource and KafkaSourceBuilder appear to prevent users from
>>>> providing their own KafkaSubscriber. Am I overlooking something?
>>>>
>>>> In my case I have an external system that controls which topics we
>>>> should be ingesting, and that set can change over time. I need to add and
>>>> remove topics as we refresh configuration from this external system,
>>>> without having to stop and restart our Flink job. Initially it appeared I
>>>> could accomplish this by providing my own implementation of the
>>>> `KafkaSubscriber` interface, which would be invoked periodically as
>>>> configured by the `partition.discovery.interval.ms` property. However,
>>>> there is no way to provide my implementation to the KafkaSource: the
>>>> constructor for KafkaSource is package-protected, and the
>>>> KafkaSourceBuilder does not expose a way to supply the `KafkaSubscriber`.
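>>>>
>>>> For illustration, this is roughly the subscriber I would have liked to
>>>> plug in (a sketch only; it assumes the KafkaSubscriber interface in
>>>> flink-connector-kafka exposes a single
>>>> getSubscribedTopicPartitions(AdminClient) method, and the Supplier stands
>>>> in for my external configuration system):
>>>>
>>>> import java.util.HashSet;
>>>> import java.util.List;
>>>> import java.util.Set;
>>>> import java.util.function.Supplier;
>>>> import org.apache.flink.connector.kafka.source.enumerator.subscriber.KafkaSubscriber;
>>>> import org.apache.kafka.clients.admin.AdminClient;
>>>> import org.apache.kafka.clients.admin.TopicDescription;
>>>> import org.apache.kafka.common.TopicPartition;
>>>> import org.apache.kafka.common.TopicPartitionInfo;
>>>>
>>>> public class ExternalConfigSubscriber implements KafkaSubscriber {
>>>>
>>>>     // Stand-in for the external system that decides which topics are
>>>>     // enabled (would need to be serializable in practice).
>>>>     private final Supplier<List<String>> enabledTopicsSupplier;
>>>>
>>>>     public ExternalConfigSubscriber(Supplier<List<String>> enabledTopicsSupplier) {
>>>>         this.enabledTopicsSupplier = enabledTopicsSupplier;
>>>>     }
>>>>
>>>>     @Override
>>>>     public Set<TopicPartition> getSubscribedTopicPartitions(AdminClient adminClient) {
>>>>         // Re-read the enabled topics on every discovery cycle.
>>>>         List<String> topics = enabledTopicsSupplier.get();
>>>>         Set<TopicPartition> partitions = new HashSet<>();
>>>>         try {
>>>>             // Resolve each enabled topic to its current partitions.
>>>>             for (TopicDescription description :
>>>>                     adminClient.describeTopics(topics).all().get().values()) {
>>>>                 for (TopicPartitionInfo info : description.partitions()) {
>>>>                     partitions.add(new TopicPartition(description.name(), info.partition()));
>>>>                 }
>>>>             }
>>>>         } catch (Exception e) {
>>>>             throw new RuntimeException("Failed to resolve subscribed topic partitions", e);
>>>>         }
>>>>         return partitions;
>>>>     }
>>>> }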
>>>>
>>>> How can I accomplish a periodic refresh of the topics to ingest?
>>>>
>>>> Thanks
>>>>
>>>>
>>>>
>
> --
> Regards,
> Denis Nutiu
>


-- 
Regards,
Denis Nutiu
