Hi!

Executing a set of statements with the SQL client has been supported since
Flink 1.13 [1]. Please consider upgrading your Flink version.

[1]
https://ci.apache.org/projects/flink/flink-docs-release-1.13/docs/dev/table/sqlclient/#execute-a-set-of-sql-statements
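
For reference, in the 1.13 SQL client a set of statements is wrapped in a
statement set block; a minimal sketch (the table and column names here are
made up for illustration):

```sql
BEGIN STATEMENT SET;

INSERT INTO sink_a SELECT id, name FROM source_t;
INSERT INTO sink_b SELECT id, ts   FROM source_t;

END;
```

Both INSERTs in the block are planned together and submitted as a single job.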

方汉云 <fanghan...@shizhuang-inc.com> wrote on Mon, Nov 1, 2021 at 8:31 PM:

> Hi,
>
>
> I used the official flink-1.12.5 package, configured
> sql-client-defaults.yaml, and ran bin/sql-client.sh embedded
>
>
> cat conf/sql-client-defaults.yaml
>
> catalogs:
>   # A typical catalog definition looks like:
>   - name: myhive
>     type: hive
>     hive-conf-dir: /apps/conf/hive
>     default-database: default
>
>
> How can I solve this?
>
> On Nov 1, 2021 at 18:32, Jingsong Li <jingsongl...@gmail.com> wrote:
>
> Hi,
>
> If you are using sql-client, you can try:
> https://nightlies.apache.org/flink/flink-docs-master/docs/dev/table/sqlclient/#execute-a-set-of-sql-statements
> If you are using TableEnvironment, you can try statement set too:
> https://nightlies.apache.org/flink/flink-docs-master/docs/dev/table/common/#translate-and-execute-a-query
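>
> The StatementSet approach from the linked doc can be sketched as follows;
> this is a minimal sketch, assuming `tableEnv` is an existing
> `TableEnvironment` and the source/sink tables are already registered (the
> table names are made up):
>
> ```scala
> // Collect multiple INSERTs into one StatementSet so they are
> // compiled into a single Flink job instead of one job each.
> val stmtSet = tableEnv.createStatementSet()
> stmtSet.addInsertSql("INSERT INTO sink_a SELECT * FROM source_t")
> stmtSet.addInsertSql("INSERT INTO sink_b SELECT * FROM source_t")
> stmtSet.execute() // submits one job covering both sinks
> ```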
>
> Best,
> Jingsong
>
> On Fri, Oct 29, 2021 at 7:01 PM Jake <ft20...@qq.com> wrote:
> >
> > Hi
> >
> > You can do something like this:
> >
> > ```scala
> > // Parse each statement and dispatch on its type:
> > // SET statements update the configuration, INSERTs are buffered
> > // into a statement set, and everything else is executed directly.
> > val calciteParser = new CalciteParser(SqlUtil.getSqlParserConfig(tableEnv.getConfig))
> > sqlArr.foreach { item =>
> >     println(item)
> >     calciteParser.parse(item) match {
> >         case sqlSet: SqlSet =>
> >             configuration.setString(sqlSet.getKeyString, sqlSet.getValueString)
> >         case _: RichSqlInsert =>
> >             insertSqlBuffer += item
> >         case _ =>
> >             println(item)
> >             tableEnv.executeSql(item).print()
> >     }
> > }
> >
> > // Submit all buffered INSERTs together as a single job.
> > if (insertSqlBuffer.nonEmpty) {
> >     insertSqlBuffer.foreach { item =>
> >         println("insert sql: " + item)
> >         statementSet.addInsertSql(item)
> >     }
> >     println(statementSet.explain())
> >     statementSet.execute()
> > }
> > ```
> >
> >
> > On Oct 29, 2021, at 18:50, wx liao <liaowx8...@gmail.com> wrote:
> >
> > Hi:
> > I use Flink SQL and run a script that has one source and two sinks. I can
> > see 2 jobs running through the web UI; is that normal?
> > Is there a way to make sure only one job runs for the one source and two
> > sinks? Thank you
> >
> >
>
>
> --
> Best, Jingsong Lee
>
>
>
>
