Dear all,

The chat records of the WeChat group "Apache Linkis Community Development Group" are as follows:
————— 2022-09-28 —————

hua 15:32
The installation instructions don't seem to mention that the configuration in this file needs to be changed.

hua 15:33
I'm not using this version locally.

peacewong@WDS 15:35
It is replaced automatically by the installation script.

hua 15:40
Indeed, the value in the database is correct.

hua 15:40
My Spark here is 2.3.2.

hua 15:41
Now an error is reported when starting the engine.

hua 15:42
Searching the whole installation directory, the two files I mainly found are still 2.4.3.

hua 15:43
The IDE engine works now.

Heisenberg 15:44
Under the linkis-cli related directory, check the Spark version number in the linkis-spark-submit file.

hua 15:44
But the node engine fails to start.

hua 15:44
On the UI the node engine version also looks correct.

Heisenberg 15:44
The name may not be quite right.

Heisenberg 15:44
Please post a screenshot of the code you executed with linkis-cli.

hua 15:45
This one doesn't work.

hua 15:45
This one works.

Heisenberg 15:46
For this you would need to look at the DSS workflow service logs, right?

Heisenberg 15:46
Is the Spark version number set in the DSS service configuration file?

Heisenberg 15:47
1. If linkis-cli-spark-submit reports an error, check the Spark version number in the linkis-cli-spark-submit script.
2. If a DSS script fails when executing Spark, check the Spark version number in the database and the plugin directory.
3. If a DSS workflow trial run fails, check the Spark version number in the DSS configuration file.

hua 15:49
That submit script is wrong; let me change it and try.

Heisenberg 15:50
I don't think this side will replace the version you modified, so you could consider raising a PR.

hua 15:50
Let me change it first and see whether it solves the problem.

--
Best Regards

康悦 ritakang
GitHub: Ritakang0451
E-mail: rita0...@163.com
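A minimal shell sketch of the checks described in the 15:47 message above, assuming LINKIS_HOME and DSS_HOME point at the Linkis and DSS installation directories and that 2.4.3 is the stale version string left by the install scripts; the paths and version strings are illustrative assumptions, not values confirmed in the chat:

    # Hedged sketch: one way to locate files under the install directories
    # that still hard-code the old Spark version.
    # LINKIS_HOME, DSS_HOME and the two version strings are assumptions
    # for illustration; substitute the values of your own deployment.
    OLD_SPARK_VERSION="2.4.3"
    NEW_SPARK_VERSION="2.3.2"

    # Check 1: the linkis-cli spark-submit wrapper script.
    grep -rln "$OLD_SPARK_VERSION" "$LINKIS_HOME" --include="*spark-submit*"

    # Check 2 (the Spark version recorded in the database) needs a query
    # against the Linkis metadata database and is not shown here.

    # Check 3: the DSS service configuration files.
    grep -rln "$OLD_SPARK_VERSION" "$DSS_HOME/conf"

    # Once a stale file is confirmed, the version can be edited in place, e.g.:
    # sed -i "s/$OLD_SPARK_VERSION/$NEW_SPARK_VERSION/g" <file>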