Sent: Wednesday, December 6, 2023 7:21:50 PM
To: elakiya udhayanan ; user@flink.apache.org
Subject: Re: Query on using two sinks for a Flink job (Flink SQL)

Hi Elakiya,

You can try executing TableEnvironmentImpl#executeInternal for non-insert
statements, then using StatementSet.addInsertSql to add multiple insertion
statements.
Hi Elakiya,

You should use DML in the statement set instead of DQL.

Here is a simple example:

executeSql("CREATE TABLE source_table1 ..");
executeSql("CREATE TABLE source_table2 ..");
executeSql("CREATE TABLE sink_table1 ..");
executeSql("CREATE TABLE sink_table2 ..");
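Following that advice, here is a minimal sketch of how the two insertions could be grouped into one statement set in Flink SQL so that both sinks run inside a single job. The SELECT bodies are placeholders, not the actual queries:

```sql
-- Group both INSERTs into one statement set; Flink compiles them
-- into a single job graph with two sinks.
EXECUTE STATEMENT SET
BEGIN
  INSERT INTO sink_table1 SELECT .. FROM source_table1;
  INSERT INTO sink_table2 SELECT .. FROM source_table2;
END;
```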
Hi Xuyang, Zhangao,
Thanks for your response. I have attached the sample job files that I tried
with the StatementSet and with two queries. Please let me know if you can
point out where I am possibly going wrong.
Thanks,
Elakiya
On Wed, Dec 6, 2023 at 4:51 PM Xuyang wrote:
> Hi, Elakiya.
>
Sent: Wednesday, December 6, 2023 17:49
To: user@flink.apache.org
Subject: Query on using two sinks for a Flink job (Flink SQL)
Hi Team,

I would like to know the possibility of having two sinks in a single Flink
job. In my case I am using a Flink SQL based job where I try to consume
from two different Kafka topics using the CREATE TABLE DDL (as below), then
use a join condition to correlate them, and at present write i
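The setup described above (two Kafka topics correlated with a join, results written to two sinks in one job) could look like the following sketch. All table names, topic names, columns, and connector options here are assumptions for illustration, not the actual job:

```sql
-- Two Kafka-backed source tables (connector options are illustrative).
CREATE TABLE orders (
  order_id STRING,
  user_id  STRING
) WITH (
  'connector' = 'kafka',
  'topic' = 'orders',
  'properties.bootstrap.servers' = 'localhost:9092',
  'format' = 'json',
  'scan.startup.mode' = 'earliest-offset'
);

CREATE TABLE users (
  user_id STRING,
  name    STRING
) WITH (
  'connector' = 'kafka',
  'topic' = 'users',
  'properties.bootstrap.servers' = 'localhost:9092',
  'format' = 'json',
  'scan.startup.mode' = 'earliest-offset'
);

-- Two sink tables, one per output topic.
CREATE TABLE sink_a (
  order_id STRING,
  name     STRING
) WITH (
  'connector' = 'kafka',
  'topic' = 'sink-a',
  'properties.bootstrap.servers' = 'localhost:9092',
  'format' = 'json'
);

CREATE TABLE sink_b (
  order_id STRING,
  user_id  STRING
) WITH (
  'connector' = 'kafka',
  'topic' = 'sink-b',
  'properties.bootstrap.servers' = 'localhost:9092',
  'format' = 'json'
);

-- Both INSERTs must go into one statement set so they form a single job;
-- submitting them as two separate executeSql calls would start two jobs.
EXECUTE STATEMENT SET
BEGIN
  INSERT INTO sink_a
    SELECT o.order_id, u.name
    FROM orders AS o
    JOIN users AS u ON o.user_id = u.user_id;
  INSERT INTO sink_b
    SELECT o.order_id, o.user_id
    FROM orders AS o;
END;
```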