Hi, the image didn't come through — please use an image host or paste the code inline. I also see a YAML file in your setup; in fact, I'd recommend using DDL to create the catalog instead.
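For reference, the DDL approach in the SQL client looks roughly like this — a minimal sketch reusing the catalog name and hive-conf-dir that appear in your error output; adjust both to your environment:

```sql
-- Sketch only: 'myhive' and the conf dir are taken from the stack trace above.
CREATE CATALOG myhive WITH (
  'type' = 'hive',
  'hive-conf-dir' = '/Users/feng/hive-2.3.6/conf'
);

USE CATALOG myhive;
```

This avoids the legacy YAML session environment entirely, so the catalog is created through the regular factory path at statement execution time.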
Best,
Shengkai

drewfranklin <[email protected]> wrote on Thu, Nov 18, 2021 at 18:01:
> Hello, friends!
> I followed the official documentation and hit an error when connecting to the Hive catalog from the SQL client.
> My Hive version is 2.3.6, and my Flink version is 1.13.1.
>
> Following the "bundled" approach described in the official docs, I added the jars shown in the screenshot below under flink/lib, then restarted the cluster and started sql-client. Connecting fails with the error below; from the error it looks like a jar is missing, but I don't know which one. I also tried the second approach and got the same error. It seems the catalog connection cannot be created.
> Yaml file:
>
> Reading session environment from:
> file:/Users/feng/flink-1.13.1/bin/../catlog_yaml/hiveCatalog.yaml
>
> Exception in thread "main" org.apache.flink.table.client.SqlClientException: Unexpected exception. This is a bug. Please consider filing an issue.
>     at org.apache.flink.table.client.SqlClient.startClient(SqlClient.java:201)
>     at org.apache.flink.table.client.SqlClient.main(SqlClient.java:161)
> Caused by: org.apache.flink.table.api.ValidationException: Unable to create catalog 'myhive'.
>
> Catalog options are:
> 'hive-conf-dir'='/Users/feng/hive-2.3.6/conf'
> 'type'='hive'
>     at org.apache.flink.table.factories.FactoryUtil.createCatalog(FactoryUtil.java:270)
>     at org.apache.flink.table.client.gateway.context.LegacyTableEnvironmentInitializer.createCatalog(LegacyTableEnvironmentInitializer.java:217)
>     at org.apache.flink.table.client.gateway.context.LegacyTableEnvironmentInitializer.lambda$initializeCatalogs$1(LegacyTableEnvironmentInitializer.java:120)
>     at java.base/java.util.HashMap.forEach(HashMap.java:1336)
>     at org.apache.flink.table.client.gateway.context.LegacyTableEnvironmentInitializer.initializeCatalogs(LegacyTableEnvironmentInitializer.java:117)
>     at org.apache.flink.table.client.gateway.context.LegacyTableEnvironmentInitializer.initializeSessionState(LegacyTableEnvironmentInitializer.java:105)
>     at org.apache.flink.table.client.gateway.context.SessionContext.create(SessionContext.java:233)
>     at org.apache.flink.table.client.gateway.local.LocalContextUtils.buildSessionContext(LocalContextUtils.java:100)
>     at org.apache.flink.table.client.gateway.local.LocalExecutor.openSession(LocalExecutor.java:91)
>     at org.apache.flink.table.client.SqlClient.start(SqlClient.java:88)
>     at org.apache.flink.table.client.SqlClient.startClient(SqlClient.java:187)
>     ... 1 more
> Caused by: org.apache.flink.table.api.TableException: Could not load service provider for factories.
>     at org.apache.flink.table.factories.FactoryUtil.discoverFactories(FactoryUtil.java:507)
>     at org.apache.flink.table.factories.FactoryUtil.discoverFactory(FactoryUtil.java:298)
>     at org.apache.flink.table.factories.FactoryUtil.getCatalogFactory(FactoryUtil.java:455)
>     at org.apache.flink.table.factories.FactoryUtil.createCatalog(FactoryUtil.java:251)
>     ... 11 more
> Caused by: java.util.ServiceConfigurationError: org.apache.flink.table.factories.Factory: org.apache.flink.table.module.hive.HiveModuleFactory not a subtype
>     at java.base/java.util.ServiceLoader.fail(ServiceLoader.java:589)
>     at java.base/java.util.ServiceLoader$LazyClassPathLookupIterator.hasNextService(ServiceLoader.java:1237)
>     at java.base/java.util.ServiceLoader$LazyClassPathLookupIterator.hasNext(ServiceLoader.java:1265)
>     at java.base/java.util.ServiceLoader$2.hasNext(ServiceLoader.java:1300)
>     at java.base/java.util.ServiceLoader$3.hasNext(ServiceLoader.java:1385)
>     at java.base/java.util.Iterator.forEachRemaining(Iterator.java:132)
>     at org.apache.flink.table.factories.FactoryUtil.discoverFactories(FactoryUtil.java:503)
>     ... 14 more
