Linkis version: 1.0.3
Spark: spark-2.4.5-bin-hadoop2.7
Error message: Job with execId-IDE_admin_spark_8 and subJobId : 9 from orchestrator
completed with state ErrorExecuteResponse(21304, Task is Failed, errorMsg:
errCode: 12003, desc: hadoop1:9101_8 Failed to async get EngineNode
RMErrorException: errCode:
Hello, does the corresponding Yarn queue name exist? Could you share the
linkisManager log?
肖培栋 wrote on Tue, Jul 26, 2022 at 19:43:
> Linkis version: 1.0.3
> Spark: spark-2.4.5-bin-hadoop2.7
>
> Error message: Job with execId-IDE_admin_spark_8 and subJobId : 9 from
> or
Conference Theme: Apache Linkis(incubating) bi-weekly meeting 2022-07-27
Meeting Time: 2022/07/27 19:30-21:00 (GMT+08:00) China Standard Time -
Beijing
Click the link to join the meeting, or add to the meeting list:
https://meeting.tencent.com/dm/cE6SdQIoSDEJ
#Tencent Conference: 396-220-298
Thanks, I will participate online
hong luo wrote on Tue, Jul 26, 2022 at 19:49:
> Conference Theme:Apache Linkis(incubating) bi-weekly meeting 2022-07-27
> Meeting Time:2022/07/27 19:30-21:00 (GMT+08:00) China Standard Time -
> Beijing Click the link to join the meeting, or add to the meeting
> list: https://
The json4s jar cannot simply be swapped in; after replacing the json4s jar,
Linkis must be recompiled:
mvn -N install
mvn clean install
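A minimal sketch of that rebuild sequence, assuming the new json4s version has already been declared in the root pom.xml (the `-DskipTests` flag is an optional addition here, just to speed up the rebuild):

```shell
# Install the parent POM only (-N = non-recursive), so that child
# modules resolve the updated dependency versions declared in it.
mvn -N install

# Rebuild all modules from scratch against the replaced json4s jar.
mvn clean install -DskipTests
```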
肖培栋 wrote on Tue, Jul 26, 2022 at 19:43:
> Linkis version: 1.0.3
> Spark: spark-2.4.5-bin-hadoop2.7
>
> Error message: Job with execId-IDE_admin_spark_8 and subJobId : 9 from
> orchestrator completed with state ErrorExecuteResponse(21304, Task is
> Fa
Hello Apache Linkis PPMC and Community,
I drafted the Linkis podling report for August 2022. Please help me check
it and reply directly if you have any questions.
---
## Linkis
Apache Linkis is a computation middleware project, which decouples the
upper applications and the underlying data