[ https://issues.apache.org/jira/browse/HIVE-8835?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Chengxiang Li updated HIVE-8835:
--------------------------------
    Status: Patch Available  (was: Open)

> identify dependency scope for Remote Spark Context.[Spark Branch]
> -----------------------------------------------------------------
>
>                 Key: HIVE-8835
>                 URL: https://issues.apache.org/jira/browse/HIVE-8835
>             Project: Hive
>          Issue Type: Sub-task
>          Components: Spark
>            Reporter: Chengxiang Li
>            Assignee: Chengxiang Li
>              Labels: Spark-M3
>         Attachments: HIVE-8835.1-spark.patch
>
>
> When a job is submitted through the Remote Spark Context, Spark RDD graph
> generation and job submission are executed on the remote side, so we have to
> add the Hive-related dependencies to its classpath via
> spark.driver.extraClassPath. Instead of adding every Hive/Hadoop dependency,
> we should narrow the scope and identify exactly which dependencies the Remote
> Spark Context requires.

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
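
A minimal sketch (not part of the issue itself) of how a narrowed classpath
might be passed to the remote driver through spark.driver.extraClassPath. The
jar paths and class name below are hypothetical placeholders; identifying the
actual minimal dependency set is the goal of this sub-task.

    import org.apache.spark.SparkConf;

    public class RemoteDriverClasspathSketch {
        public static void main(String[] args) {
            // Hypothetical, narrowed set of Hive jars; the real minimal list
            // is what this sub-task sets out to identify.
            String hiveDeps = String.join(":",
                    "/opt/hive/lib/hive-exec.jar",
                    "/opt/hive/lib/hive-common.jar");

            // Pass only the required Hive dependencies to the remote driver's
            // classpath instead of all hive/hadoop jars.
            SparkConf conf = new SparkConf()
                    .setAppName("hive-on-spark-remote-context")
                    .set("spark.driver.extraClassPath", hiveDeps);
        }
    }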