No, the topic-pattern won't work for my case. The topics I should
subscribe to can be enabled or disabled based on settings I read from
another system, so there is no way to craft a single regular expression
that would match the current state of all potential topics. Additionally,
the documentation you linked seems to suggest that the regular expression
is evaluated only once, "when the job starts running", so my understanding
is that it would not pick up new topics that match the pattern after the
job starts.
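
For reference, this is roughly the kind of subscriber I was hoping to plug
in. To be clear, this is only a sketch: KafkaSubscriber is an internal
connector interface whose exact package and signature may differ by Flink
version, and SettingsDrivenSubscriber, SettingsClient and
currentlyEnabledTopics() are placeholder names standing in for my external
settings system, not real classes.

// Sketch only: KafkaSubscriber is an internal connector interface; the
// package and method signature below may differ between Flink versions.
import java.util.Collections;
import java.util.HashSet;
import java.util.Set;

import org.apache.flink.connector.kafka.source.enumerator.subscriber.KafkaSubscriber;
import org.apache.kafka.clients.admin.AdminClient;
import org.apache.kafka.common.TopicPartition;

public class SettingsDrivenSubscriber implements KafkaSubscriber {

    // Placeholder for the external system that decides which topics are
    // enabled. In a real job it would need to be serializable or created
    // lazily on the enumerator.
    private final SettingsClient settingsClient;

    public SettingsDrivenSubscriber(SettingsClient settingsClient) {
        this.settingsClient = settingsClient;
    }

    @Override
    public Set<TopicPartition> getSubscribedTopicPartitions(AdminClient adminClient) {
        Set<TopicPartition> result = new HashSet<>();
        // Re-read the enabled topics on every invocation, i.e. every
        // partition.discovery.interval.ms, so added/removed topics are
        // picked up without restarting the job.
        for (String topic : settingsClient.currentlyEnabledTopics()) {
            try {
                adminClient.describeTopics(Collections.singletonList(topic))
                        .all()
                        .get()
                        .forEach((name, description) ->
                                description.partitions().forEach(info ->
                                        result.add(new TopicPartition(name, info.partition()))));
            } catch (Exception e) {
                throw new RuntimeException("Failed to look up partitions for " + topic, e);
            }
        }
        return result;
    }
}

But since the KafkaSource constructor is package-private and
KafkaSourceBuilder has no way to accept a custom subscriber, I don't see a
supported way to wire something like this in, which is what I'm asking
about.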


On Wed, Oct 13, 2021 at 8:51 PM Caizhi Weng <tsreape...@gmail.com> wrote:

> Hi!
>
> I suppose you want to read from different topics every now and then? Does
> the topic-pattern option [1] in the Table API Kafka connector meet your needs?
>
> [1]
> https://ci.apache.org/projects/flink/flink-docs-master/docs/connectors/table/kafka/#topic-pattern
>
> Preston Price <nacro...@gmail.com> wrote on Thursday, October 14, 2021 at 1:34 AM:
>
>> The KafkaSource and KafkaSourceBuilder appear to prevent users from
>> providing their own KafkaSubscriber. Am I overlooking something?
>>
>> In my case I have an external system that controls which topics we should
>> be ingesting, and that set can change over time. I need to add and remove
>> topics as we refresh configuration from this external system, without
>> having to stop and restart our Flink job. Initially it appeared I could
>> accomplish this by providing my own implementation of the `KafkaSubscriber`
>> interface, which would be invoked periodically as configured by the
>> `partition.discovery.interval.ms` property. However, there is no way to
>> pass my implementation to the KafkaSource, since the constructor for
>> KafkaSource is package-private and the KafkaSourceBuilder does not expose
>> a way to supply a `KafkaSubscriber`.
>>
>> How can I accomplish a periodic refresh of the topics to ingest?
>>
>> Thanks
>>
>>
>>
