aiceflower opened a new issue, #5288:
URL: https://github.com/apache/linkis/issues/5288

   ### Search before asking
   
   - [x] I searched the [issues](https://github.com/apache/linkis/issues) and 
found no similar issues.
   
   
   ### Linkis Component
   
   linkis-engineconn-plugin
   
   ### Steps to reproduce
   
   1. Create a UDF function in the Linkis management console.
   2. Start the Hive engine in concurrent mode: wds.linkis.engineconn.support.parallelism=true
   3. Submit a Hive task that uses the created UDF function.
   
   Executing the task fails with the following error:
   2025-11-17 19:06:38.387 [INFO ] [main] o.a.l.e.c.e.c.ComputationExecutorManagerImpl (192) [getExecutorByLabels] [JobId-] - For a single Executor EC, if an Executor exists, it will be returned directly
   2025-11-17 19:06:38.442 [INFO ] [main] o.a.l.e.h.e.HiveEngineConcurrentConnExecutor (137) [executeLine] [JobId-] - HiveEngineConcurrentConnExecutor Ready to executeLine: use hadoop_ind
   2025-11-17 19:06:38.444 [ERROR] [main] o.a.l.e.c.e.h.HiveUseDatabaseEngineHook (129) [$anonfun$tryAndError$1] [JobId-] - java.util.NoSuchElementException: None.get
        at scala.None$.get(Option.scala:529) ~[scala-library-2.12.17.jar:?]
        at scala.None$.get(Option.scala:527) ~[scala-library-2.12.17.jar:?]
        at org.apache.linkis.engineplugin.hive.executor.HiveEngineConcurrentConnExecutor.executeLine(HiveEngineConcurrentConnExecutor.scala:138) ~[linkis-engineplugin-hive-1.8.0.jar:1.8.0]
        at org.apache.linkis.engineconn.computation.executor.hook.UseDatabaseEngineHook.$anonfun$afterExecutionExecute$1(UseDatabaseEngineHook.scala:73) ~[linkis-computation-engineconn-1.8.0.jar:1.8.0]
        at org.apache.linkis.common.utils.Utils$.tryCatch(Utils.scala:51) ~[linkis-common-1.8.0.jar:1.8.0]
        at org.apache.linkis.common.utils.Utils$.tryAndError(Utils.scala:117) ~[linkis-common-1.8.0.jar:1.8.0]
        at org.apache.linkis.engineconn.computation.executor.hook.UseDatabaseEngineHook.afterExecutionExecute(UseDatabaseEngineHook.scala:51) ~[linkis-computation-engineconn-1.8.0.jar:1.8.0]
        at org.apache.linkis.engineconn.launch.EngineConnServer$.$anonfun$main$5(EngineConnServer.scala:103) ~[linkis-engineconn-core-1.8.0.jar:1.8.0]
        at org.apache.linkis.engineconn.launch.EngineConnServer$.$anonfun$main$5$adapted(EngineConnServer.scala:103) ~[linkis-engineconn-core-1.8.0.jar:1.8.0]
        at scala.collection.IndexedSeqOptimized.foreach(IndexedSeqOptimized.scala:36) ~[scala-library-2.12.17.jar:?]
        at scala.collection.IndexedSeqOptimized.foreach$(IndexedSeqOptimized.scala:33) ~[scala-library-2.12.17.jar:?]
        at scala.collection.mutable.ArrayOps$ofRef.foreach(ArrayOps.scala:198) ~[scala-library-2.12.17.jar:?]
        at org.apache.linkis.engineconn.launch.EngineConnServer$.main(EngineConnServer.scala:103) ~[linkis-engineconn-core-1.8.0.jar:1.8.0]
        at org.apache.linkis.engineconn.launch.EngineConnServer.main(EngineConnServer.scala) ~[linkis-engineconn-core-1.8.0.jar:1.8.0]
   
   2025-11-17 19:06:38.451 [INFO ] [main] o.a.l.e.c.e.h.HiveInitSQLHook (64) [$anonfun$afterExecutionExecute$1] [JobId-] - hadoop engineConn skip execute init_sql
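   
   Per the trace, the None.get is raised from scala.None$.get inside HiveEngineConcurrentConnExecutor.executeLine (line 138), reached when UseDatabaseEngineHook.afterExecutionExecute re-drives executeLine ("use hadoop_ind") on the concurrent executor. A minimal, self-contained sketch of that failure mode follows; it assumes the value fetched at that line is an Option-typed per-task field such as a job id, and all names (NoneGetRepro, ExecutionContext, jobId) are illustrative stand-ins, not the actual Linkis types:
   
   object NoneGetRepro {
     // Hypothetical stand-in for the executor's execution context; the real
     // Linkis type differs, but the Option shape is what matters here.
     final case class ExecutionContext(jobId: Option[String])
   
     // Mirrors the failing pattern: an unguarded Option#get throws
     // java.util.NoSuchElementException: None.get when no job id is attached,
     // which is the case for hook-driven statements like "use hadoop_ind".
     def executeLineUnsafe(ctx: ExecutionContext, code: String): String = {
       val taskId = ctx.jobId.get // <- crashes exactly as in the log above
       s"[$taskId] $code"
     }
   
     def main(args: Array[String]): Unit =
       executeLineUnsafe(ExecutionContext(jobId = None), "use hadoop_ind")
   }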
   
   ### Expected behavior
   
   This is a code bug: the Hive task should execute successfully with the created UDF in concurrent mode, instead of failing with java.util.NoSuchElementException: None.get.
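   
   A hedged sketch of the kind of guard that would avoid the crash, assuming the value read at HiveEngineConcurrentConnExecutor.scala:138 is an Option (names are illustrative, not the actual Linkis API):
   
   object GuardedExecuteLine {
     final case class ExecutionContext(jobId: Option[String]) // same stand-in as above
   
     // Pattern-match instead of Option#get so hook-driven statements that carry
     // no job id fall back to a placeholder instead of crashing the engine.
     def executeLineSafe(ctx: ExecutionContext, code: String): String =
       ctx.jobId match {
         case Some(taskId) => s"[$taskId] $code"
         case None         => s"[no-job-id] $code"
       }
   
     def main(args: Array[String]): Unit =
       println(executeLineSafe(ExecutionContext(jobId = None), "use hadoop_ind"))
   }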
   
   ### Your environment
   
   - Linkis version used: 1.8.0 (per the 1.8.0 jars in the stack trace)
   - Environment name and version:
       - cdh-5.14.2
       - hdp-3.1.5
       - hive-2.1.1
       - spark-3.2.1
       - scala-2.12.2
       - jdk 1.8.0_121
       - ....
   
   
   ### Anything else
   
   _No response_
   
   ### Are you willing to submit a PR?
   
   - [ ] Yes I am willing to submit a PR!

