I followed the steps in that blog post for the build. Hive SQL now works fine; only Spark SQL fails at runtime. I suspected a protobuf-java conflict (Hadoop/Spark/Hive all use protobuf-java 2.5.0) and rebuilt Linkis 1.1.3 against that same version, but the error persists.
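One way to narrow down a conflict like this is to check, on the same classpath the Spark engine runs with, which jar each of the two `Message` classes from the stack trace actually resolves from. The snippet below is a minimal, hypothetical diagnostic (class names are taken from the exception in this thread; it is not part of Linkis):

```java
// Hypothetical diagnostic, not part of Linkis: resolve the two protobuf
// Message classes from the ClassCastException and report which jar
// supplies each one on the current classpath.
public class ProtoCheck {

    static String locate(String className) {
        try {
            Class<?> c = Class.forName(className);
            java.security.CodeSource src = c.getProtectionDomain().getCodeSource();
            // CodeSource can be null for bootstrap-loaded classes.
            return src == null ? "bootstrap/unknown location" : src.getLocation().toString();
        } catch (ClassNotFoundException e) {
            return "not on classpath";
        }
    }

    public static void main(String[] args) {
        // Both names come from the stack trace in this thread: the unshaded
        // HDFS proto class and the shaded protobuf runtime it was cast to.
        for (String name : new String[] {
                "com.google.protobuf.Message",
                "org.apache.hadoop.shaded.com.google.protobuf.Message"}) {
            System.out.println(name + " -> " + locate(name));
        }
    }
}
```

Run it with the engine's effective classpath prepended (e.g. `java -cp "$ENGINE_CLASSPATH:." ProtoCheck`, where `ENGINE_CLASSPATH` is an assumption for your deployment). The `org.apache.hadoop.shaded.*` prefix suggests a Hadoop 3 shaded client artifact (such as `hadoop-client-runtime`) is on the classpath alongside unshaded HDFS proto classes; if the two lines resolve from different Hadoop client jars, the conflict is in the classpath assembly rather than in the protobuf-java version you compile against.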
On 2022-09-08 00:29:25, "cas...@apache.org" <cas...@apache.org> wrote:
>If you are adapting to hadoop3 and you need to make changes, you can refer
>to the practice summary blog post of a community user:
>
>https://linkis.staged.apache.org/zh-CN/blog/2022/08/08/linkis111-compile-integration/#linkis
>
>Best Regards!
>Chen Xia
>
>denghaibin <denghaibi...@163.com> wrote on Wed, Sep 7, 2022 at 22:56:
>
>> Hello:
>> Running spark-sql "show tables" under linkis-cli fails.
>> Error message:
>> Exception while invoking call #2
>> ClientNamenodeProtocolTranslatorPB.mkdirs over hadoop06/
>> 172.30.203.217:8020.
>>
>> Not retrying because try once and fail.
>>
>> java.lang.ClassCastException:
>> org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$MkdirsRequestProto
>> cannot be cast to org.apache.hadoop.shaded.com.google.protobuf.Message
>>
>> -----------------------------------------------------------------------------------------------------------------
>> Environment:
>> Apache
>> hadoop 3.2.4
>> spark 3.2.1
>> hive 3.1.2
>> linkis 1.1.3
>>
>> ---------------------------------------------------------------------
>> To unsubscribe, e-mail: dev-unsubscr...@linkis.apache.org
>> For additional commands, e-mail: dev-h...@linkis.apache.org