Hi Jaqie,

I'm not sure whether this is easily possible with Flink's SQL API, but if you
used the DataStream API directly you could create a connected stream with
two inputs: one for the normal message stream and one for the
configuration stream. Whenever the configuration changes, you would stream
the change into your application (e.g. by writing it to Kafka), and the
connected stream operators could then apply the configuration change.
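
To make this concrete, here is a rough sketch of the idea using Flink's
broadcast state pattern, which is one way to realize such a connected
stream (the class names `Message` and `Result` and the single "key" entry
are placeholders for your own types; treat this as an illustration, not a
drop-in solution):

```java
import org.apache.flink.api.common.state.MapStateDescriptor;
import org.apache.flink.streaming.api.datastream.BroadcastStream;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.functions.co.BroadcastProcessFunction;
import org.apache.flink.util.Collector;

// Describes the broadcast state that holds the current configuration.
final MapStateDescriptor<String, String> configDescriptor =
    new MapStateDescriptor<>("config", String.class, String.class);

// configStream would be e.g. a Kafka source carrying config updates.
BroadcastStream<String> configBroadcast = configStream.broadcast(configDescriptor);

messageStream
    .connect(configBroadcast)
    .process(new BroadcastProcessFunction<Message, String, Result>() {

        @Override
        public void processElement(Message msg, ReadOnlyContext ctx,
                                   Collector<Result> out) throws Exception {
            // Read the latest configuration from broadcast state.
            String config = ctx.getBroadcastState(configDescriptor).get("key");
            out.collect(applyConfig(msg, config)); // applyConfig: your own logic
        }

        @Override
        public void processBroadcastElement(String update, Context ctx,
                                            Collector<Result> out) throws Exception {
            // A config change arrived; update the broadcast state so that
            // all parallel instances see the new value.
            ctx.getBroadcastState(configDescriptor).put("key", update);
        }
    });
```

The key point is that configuration updates arrive as stream elements, so
the job picks them up as they happen instead of re-fetching the HTTP
endpoint per message.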

Cheers,
Till

On Thu, Nov 7, 2019 at 4:16 AM Jaqie Chan <jaqi...@gmail.com> wrote:

> Hello,
>
> I use Flink SQL API to process a data stream from Kafka. To process these
> data, I use some configurations loaded from an HTTP endpoint once at
> initialization.
>
> The configuration is loaded only once at job initialization. So it works
> well with a static configuration, but does not handle dynamic ones.
>
> How can I handle dynamic configuration without having to reload the
> configuration for each message?
>
> Thanks
> 嘉琪
>
