Are you running in batch mode? Then you probably need to write two SQL jobs (or
statements) and run them one after the other.
In streaming, the notion of order doesn't make much sense. But maybe I
misunderstood your use case.
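For the batch case, a minimal sketch of what "two statements" could look like — the table and column names here are purely hypothetical, not from the original thread:

```sql
-- First job/statement: write to the primary store (hypothetical tables).
INSERT INTO tidb_sink
SELECT id, payload FROM source_table;

-- Second job/statement: run only after the first job has finished,
-- so the notification goes out after the data is persisted.
INSERT INTO kafka_sink
SELECT id, payload FROM source_table;
```

Submitting them as separate jobs (rather than in one StatementSet) is what gives you the ordering, since each job only starts when you launch it.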
On Thu, Oct 14, 2021 at 11:37 AM Francesco Guardiani <
france...@ververica.com> wrote:
> I'm not aware of any way to control the sink order; afaik each
> Table#executeInsert will generate a separate job on its own. You may be
> able to work around it with a custom DynamicTableSink that, for each
> record, sends it to TiDB and then to Kafka.
> May I ask why you need that? If the notificatio