Hi, could you paste the full exception stack trace?
You can find it under FLINK_HOME/log/
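For example, something along these lines (the exact log file name depends on your user and host, so the glob below is just illustrative):

  ls $FLINK_HOME/log/
  grep -B 5 -A 50 'NoClassDefFoundError' $FLINK_HOME/log/*sql-client*.log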

Best regards,
Yuxia

----- Original Message -----
From: "aiden" <18765295...@163.com>
To: "user-zh" <user-zh@flink.apache.org>
Sent: Monday, February 6, 2023 4:44:02 PM
Subject: Flink SQL exception when using the hive dialect

Hi,

   I ran into a problem when working with Hive through the Flink SQL Client: after running SET table.sql-dialect=hive, the following error is reported:
Flink SQL> CREATE CATALOG myhive WITH (
>   'type' = 'hive',
>   'hive-conf-dir' = '/opt/bd/flink/hive-conf'
> );

[INFO] Execute statement succeed.

Flink SQL> 
> use catalog myhive;
[INFO] Execute statement succeed.

Flink SQL> show tables;
+------------------------------------+
|                         table name |
+------------------------------------+
|                    hive_table_name |
+------------------------------------+
20 rows in set

Flink SQL> SET table.sql-dialect=hive;
[INFO] Session property has been set.

Flink SQL> show tables;
[ERROR] Could not execute SQL statement. Reason:
org.apache.flink.table.client.gateway.SqlExecutionException: Failed to parse 
statement: show tables;

Flink SQL> show tables;
[ERROR] Could not execute SQL statement. Reason:
java.lang.NoClassDefFoundError: Could not initialize class 
org.apache.hadoop.hive.ql.exec.FunctionRegistry

The Flink version is 1.15.2, the Hadoop version is 3.0.0, and the Hive version is 2.1.1. The jars under lib/ are:
flink-cep-1.15.2.jar
flink-connector-files-1.15.2.jar
flink-csv-1.15.2.jar
flink-dist-1.15.2.jar
flink-json-1.15.2.jar
flink-scala_2.12-1.15.2.jar
flink-shaded-zookeeper-3.5.9.jar
flink-sql-connector-hive-2.2.0_2.12-1.15.2.jar
flink-table-api-java-uber-1.15.2.jar
flink-table-planner_2.12-1.15.2.jar
flink-table-runtime-1.15.2.jar
log4j-1.2-api-2.17.1.jar
log4j-api-2.17.1.jar
log4j-core-2.17.1.jar
log4j-slf4j-impl-2.17.1.jar
Following the official documentation, I removed flink-sql-connector-hive-2.2.0_2.12-1.15.2.jar and added antlr-runtime-3.5.2.jar, flink-connector-hive_2.12-1.15.2.jar, and hive-exec-2.1.1.jar, but the same error still occurs. How can this be resolved? The jar swap I performed is sketched below.
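The change to lib/ was roughly the following (source paths are illustrative for my environment):

  cd $FLINK_HOME/lib
  rm flink-sql-connector-hive-2.2.0_2.12-1.15.2.jar
  cp /path/to/antlr-runtime-3.5.2.jar .
  cp /path/to/flink-connector-hive_2.12-1.15.2.jar .
  cp /path/to/hive-exec-2.1.1.jar .
  # restarted the cluster and the SQL Client afterwards so the new jars would be picked up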
