Hello, this looks like it is caused by a mismatch between the default Scala version Linkis is built with and the Scala version bundled with your Spark.
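
To confirm the mismatch, here is a minimal check you can paste into spark-shell on the node where the EngineConn starts (it uses only standard Spark/Scala APIs, nothing Linkis-specific):

    // Print the Spark version and the Scala version the shell runs on.
    // The Scala version must match the one your Linkis jars were compiled
    // against: scala.Product.$init$ only exists from Scala 2.12 onwards,
    // so classes compiled for 2.12 throw NoSuchMethodError on older Scala.
    println("Spark version: " + org.apache.spark.SPARK_VERSION)
    println("Scala version: " + scala.util.Properties.versionString)

If the two sides differ, either point Linkis at a Spark built with the matching Scala version, or rebuild Linkis against the Scala version your Spark uses.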

李超 <lichao0...@jxyl.com> wrote on Wed, Jun 5, 2024 at 15:45:

> Hello, I get this error when running Scriptis!
>
> ●Linkis version used: 1.1.2
> ●Environment name and version:
> ○cdh-5.14.2
> ○hdp-3.1.5
> ○hive-2.1.1
> ○spark-3.2.1
> ○scala-2.12.2
> ○jdk 1.8.0_121
> Local environment Spark: 1.6.3
>
>
> Q1. With the local environment Spark 1.6.3, running SQL fails.
>
> Can the Spark runtime version be specified in the Linkis configuration?
>
>
> The error message is as follows:
> 2024-05-29 10:45:27.045 ERROR jobRequest(IDE_linkis_spark_0) execute
> failed,21304, Task is Failed,errorMsg: errCode: 12003 ,desc: hdp03:9101_0
> Failed to async get EngineNode AMErrorException: errCode: 30002 ,desc:
> ServiceInstance(linkis-cg-engineconn, hdpxx:46539) ticketID:
> 0bf51275-9a46-4d3b-be2a-0b19d620a670 Failed to initialize engine, reason:
> Failed to start EngineConn, reason: Multiple versions of Spark are
> installed but SPARK_MAJOR_VERSION is not set
> Spark1 will be picked by default
> SLF4J: Class path contains multiple SLF4J bindings.
> SLF4J: Found binding in
> [jar:file:/appcom/tmp/engineConnPublickDir/aa5beb7a-33e1-4113-a59c-667a91afd881/v000001/lib/log4j-slf4j-impl-2.17.2.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> SLF4J: Found binding in
> [jar:file:/usr/hdp/2.6.5.0-292/spark/lib/spark-assembly-1.6.3.2.6.5.0-292-hadoop2.7.3.2.6.5.0-292.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for
> an explanation.
> SLF4J: Actual binding is of type
> [org.apache.logging.slf4j.Log4jLoggerFactory]
> Exception in thread "main" java.lang.NoSuchMethodError:
> scala.Product.$init$(Lscala/Product;)V
> at org.apache.linkis.common.conf.CommonVars.(CommonVars.scala:24)
> at org.apache.linkis.common.conf.CommonVars$.apply(CommonVars.scala:58)
> at org.apache.linkis.common.conf.CommonVars.apply(CommonVars.scala)
> at
> org.apache.linkis.manager.label.conf.LabelCommonConfig.(LabelCommonConfig.java:25)
> at
> org.apache.linkis.manager.label.builder.factory.LabelBuilderFactoryContext.getLabelBuilderFactory(LabelBuilderFactoryContext.java:48)
> at
> org.apache.linkis.engineconn.launch.EngineConnServer$.(EngineConnServer.scala:63)
> at
> org.apache.linkis.engineconn.launch.EngineConnServer$.(EngineConnServer.scala)
> at
> org.apache.linkis.engineconn.launch.EngineConnServer.main(EngineConnServer.scala)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:498)
> at
> org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:750)
> at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
> at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
> at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
> at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
> You can go to this
> path(/appcom/tmp/linkis/20240529/spark/0bf51275-9a46-4d3b-be2a-0b19d620a670/logs)
> to find the reason or ask the administrator for help ,ip: hdp03 ,port: 9101
> ,serviceKind: linkis-cg-linkismanager ,ip: hdp03 ,port: 9104 ,serviceKind:
> linkis-cg-entrance
> 2024-05-29 10:45:27.045 INFO Task creation time(任务创建时间): 2024-05-29
> 10:45:21, Task scheduling time(任务调度时间): 2024-05-29 10:45:22, Task start
> time(任务开始时间): 2024-05-29 10:45:22, Mission end time(任务结束时间): 2024-05-29
> 10:45:27
> 2024-05-29 10:45:27.045 INFO Task submit to Orchestrator time:2024-05-29
> 10:45:22, Task request EngineConn time:not request ec, Task submit to
> EngineConn time:not submit to ec
> 2024-05-29 10:45:27.045 INFO Your mission(您的任务) 8 The total time spent
> is(总耗时时间为): 5.9 s
> 2024-05-29 10:45:27.045 INFO Sorry. Your job completed with a status
> Failed. You can view logs for the reason.
> 2024-05-29 10:45:27.045 INFO job is completed.
>
>
>
>
>
> Sent from my WeCom (企业微信)
