Are you running in batch mode? Then you probably just need to write two SQL
jobs (or statements) and submit them one after the other.
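
Something like this sketch, for example (table names are made up, and it
assumes your Kafka source can be read as a bounded source in batch mode):

import org.apache.flink.table.api.EnvironmentSettings;
import org.apache.flink.table.api.TableEnvironment;

public class OrderedBatchInserts {
    public static void main(String[] args) throws Exception {
        TableEnvironment tEnv = TableEnvironment.create(
                EnvironmentSettings.newInstance().inBatchMode().build());

        // First job: write everything to TiDB and block until it finishes.
        tEnv.executeSql("INSERT INTO tidb_sink SELECT * FROM kafka_source")
                .await();

        // Second job: only submitted once the TiDB insert has completed.
        tEnv.executeSql("INSERT INTO kafka_notify_sink SELECT * FROM kafka_source")
                .await();
    }
}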

In streaming, the notion of "sink A first, then sink B" doesn't make much
sense, since both pipelines run continuously. But maybe I misunderstood your
use case.

On Thu, Oct 14, 2021 at 11:37 AM Francesco Guardiani <
france...@ververica.com> wrote:

> I'm not aware of any way to control the sink order; afaik each
> Table#executeInsert will generate a separate job on its own. You may be
> able to hack around it by having a custom DynamicTableSink that, for each
> record, sends it to TiDB first and then to Kafka.
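>
> Very roughly, the per-record part of such a sink could look like this
> (connection details, table and topic names are made up, and the
> DynamicTableSinkFactory boilerplate needed to register it as a connector
> is omitted):
>
> import java.sql.Connection;
> import java.sql.DriverManager;
> import java.sql.PreparedStatement;
> import java.util.Properties;
> import org.apache.flink.configuration.Configuration;
> import org.apache.flink.streaming.api.functions.sink.RichSinkFunction;
> import org.apache.flink.table.data.RowData;
> import org.apache.kafka.clients.producer.KafkaProducer;
> import org.apache.kafka.clients.producer.ProducerRecord;
>
> public class TidbThenKafkaSink extends RichSinkFunction<RowData> {
>
>     private transient Connection tidb;
>     private transient PreparedStatement insert;
>     private transient KafkaProducer<byte[], byte[]> producer;
>
>     @Override
>     public void open(Configuration parameters) throws Exception {
>         tidb = DriverManager.getConnection(
>                 "jdbc:mysql://tidb-host:4000/mydb", "user", "secret");
>         insert = tidb.prepareStatement(
>                 "INSERT INTO my_table (id, payload) VALUES (?, ?)");
>         Properties props = new Properties();
>         props.put("bootstrap.servers", "kafka:9092");
>         props.put("key.serializer",
>                 "org.apache.kafka.common.serialization.ByteArraySerializer");
>         props.put("value.serializer",
>                 "org.apache.kafka.common.serialization.ByteArraySerializer");
>         producer = new KafkaProducer<>(props);
>     }
>
>     @Override
>     public void invoke(RowData row, Context context) throws Exception {
>         // 1. Write the record to TiDB and wait for the statement to succeed.
>         insert.setLong(1, row.getLong(0));
>         insert.setString(2, row.getString(1).toString());
>         insert.executeUpdate();
>         // 2. Only then notify the downstream system via Kafka (blocking send).
>         producer.send(new ProducerRecord<>("notify-topic",
>                 row.getString(1).toString().getBytes())).get();
>     }
>
>     @Override
>     public void close() throws Exception {
>         if (producer != null) producer.close();
>         if (insert != null) insert.close();
>         if (tidb != null) tidb.close();
>     }
> }
>
> You would then return it from DynamicTableSink#getSinkRuntimeProvider via
> SinkFunctionProvider.of(new TidbThenKafkaSink()).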
>
> May I ask why you need that? If the notification system after the Kafka
> sink depends on TiDB, perhaps you need a retry mechanism there that can
> wait for TiDB to ingest and process that data.
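>
> A very rough sketch of that retry, as a helper on the consumer side (the
> JDBC table and column names are made up):
>
> import java.sql.Connection;
> import java.sql.PreparedStatement;
>
> public class TidbRetry {
>     /** Poll TiDB until the row referenced by the Kafka notification is visible. */
>     public static void waitForRow(Connection tidb, long recordId) throws Exception {
>         while (true) {
>             try (PreparedStatement ps = tidb.prepareStatement(
>                     "SELECT 1 FROM my_table WHERE id = ?")) {
>                 ps.setLong(1, recordId);
>                 if (ps.executeQuery().next()) {
>                     return; // the TiDB write has landed, safe to proceed
>                 }
>             }
>             Thread.sleep(500); // back off before retrying
>         }
>     }
> }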
>
> On Thu, Oct 14, 2021 at 10:40 AM WuKong <wukon...@foxmail.com> wrote:
>
>> Hi all:
>>      I have two Flink SQL statements that read from the same Kafka
>> source. One sinks the data into TiDB, and the other sinks to Kafka to
>> notify a downstream system. How can I control the sink order? When data
>> arrives from the source Kafka topic, I would like it to be written to
>> TiDB first and to Kafka after that.
>>
>> ---
>> Best,
>> WuKong
>>
>
