An alternative is to use a CDC tool like Debezium to stream your table
changes, and then ingest that stream with Flink to forward the data on to
Kafka. A rough sketch of such a relay job is below.
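
For illustration only, assuming Debezium (e.g. via Kafka Connect) already
publishes the table's change events to a Kafka topic in its JSON format;
the topic names, column names and broker address below are made up, not
from your setup:

import org.apache.flink.table.api.EnvironmentSettings;
import org.apache.flink.table.api.TableEnvironment;

public class CdcRelayJob {
    public static void main(String[] args) {
        TableEnvironment tEnv = TableEnvironment.create(
                EnvironmentSettings.newInstance().inStreamingMode().build());

        // Source: the Debezium change stream. Only rows that were actually
        // committed to the DB ever show up here.
        tEnv.executeSql(
            "CREATE TABLE orders_cdc (" +
            "  id BIGINT," +
            "  payload STRING" +
            ") WITH (" +
            "  'connector' = 'kafka'," +
            "  'topic' = 'orders_cdc'," +
            "  'properties.bootstrap.servers' = 'kafka:9092'," +
            "  'scan.startup.mode' = 'earliest-offset'," +
            "  'format' = 'debezium-json'" +
            ")");

        // Sink: the downstream topic your application wants to feed.
        // upsert-kafka is used because the CDC source is a changelog stream.
        tEnv.executeSql(
            "CREATE TABLE orders_out (" +
            "  id BIGINT," +
            "  payload STRING," +
            "  PRIMARY KEY (id) NOT ENFORCED" +
            ") WITH (" +
            "  'connector' = 'upsert-kafka'," +
            "  'topic' = 'orders_out'," +
            "  'properties.bootstrap.servers' = 'kafka:9092'," +
            "  'key.format' = 'json'," +
            "  'value.format' = 'json'" +
            ")");

        // Relay: everything that reached the DB is forwarded to Kafka.
        tEnv.executeSql(
            "INSERT INTO orders_out SELECT id, payload FROM orders_cdc");
    }
}

This way Kafka only ever sees records that already made it into the DB,
without coordinating the two sinks inside one Flink job.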

On Wed, Nov 3, 2021 at 6:17 AM Guowei Ma <guowei....@gmail.com> wrote:

> Hi, Qihua
>
> AFAIK there is no way to do it. Maybe you need to implement a "new" sink
> to achieve this goal.
>
> Best,
> Guowei
>
>
> On Wed, Nov 3, 2021 at 12:40 PM Qihua Yang <yang...@gmail.com> wrote:
>
>> Hi,
>>
>> Our Flink application has two sinks (a DB and a Kafka topic). We want to
>> push the same data to both sinks. Is it possible to push data to the Kafka
>> topic only after the data has been pushed to the DB successfully? If the
>> commit to the DB fails, we don't want that data pushed to Kafka.
>>
>> Thanks,
>> Qihua
>>
>
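
For completeness, Guowei's idea of a "new" sink could look roughly like the
custom sink below, which commits each record to the DB first and only then
sends it to Kafka. All class names, connection strings, table and topic
names are hypothetical, and this sketch gives no exactly-once guarantees
across restarts:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.util.Properties;

import org.apache.flink.configuration.Configuration;
import org.apache.flink.streaming.api.functions.sink.RichSinkFunction;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;

public class DbThenKafkaSink extends RichSinkFunction<String> {

    private transient Connection connection;
    private transient KafkaProducer<String, String> producer;

    @Override
    public void open(Configuration parameters) throws Exception {
        connection = DriverManager.getConnection(
                "jdbc:postgresql://db:5432/app", "user", "secret");
        connection.setAutoCommit(false);

        Properties props = new Properties();
        props.put("bootstrap.servers", "kafka:9092");
        props.put("key.serializer",
                "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer",
                "org.apache.kafka.common.serialization.StringSerializer");
        producer = new KafkaProducer<>(props);
    }

    @Override
    public void invoke(String value, Context context) throws Exception {
        try (PreparedStatement stmt = connection.prepareStatement(
                "INSERT INTO events (payload) VALUES (?)")) {
            stmt.setString(1, value);
            stmt.executeUpdate();
            connection.commit();   // DB write succeeded ...
        } catch (Exception e) {
            connection.rollback(); // ... otherwise skip Kafka entirely
            throw e;
        }
        producer.send(new ProducerRecord<>("events_out", value));
    }

    @Override
    public void close() throws Exception {
        if (producer != null) producer.close();
        if (connection != null) connection.close();
    }
}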
