[ https://issues.apache.org/jira/browse/HIVE-12880?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15113455#comment-15113455 ]

Xuefu Zhang commented on HIVE-12880:
------------------------------------

[~sershe], thanks for working on this. Since the jar that comes from a Spark installation doesn't work for Hive, I think the script should not bother finding it and adding it to the classpath. The alternative is to copy (or link) the right jar into Hive's /lib directory, which can be done as part of packaging. Therefore, I think a better way is to undo the original JIRA that introduced this logic.

> spark-assembly causes Hive class version problems
> -------------------------------------------------
>
>                 Key: HIVE-12880
>                 URL: https://issues.apache.org/jira/browse/HIVE-12880
>             Project: Hive
>          Issue Type: Bug
>            Reporter: Hui Zheng
>            Assignee: Sergey Shelukhin
>         Attachments: HIVE-12880.patch
>
>
> It looks like spark-assembly contains versions of Hive classes (e.g. HiveConf), and these sometimes (always?) come from older versions of Hive. We've seen problems where, depending on classpath perturbations, NoSuchFieldError may be thrown for recently added ConfVars because the HiveConf class comes from spark-assembly.
> Would making sure spark-assembly comes last in the classpath solve the problem?
> Otherwise, can we depend on something that does not package older Hive classes?
> Currently, HIVE-12179 provides a workaround (in the non-Spark use case, at least; I am assuming this issue can also affect Hive-on-Spark).



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
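When a NoSuchFieldError like the one described above appears, the first diagnostic question is which jar the offending class was actually loaded from. A minimal sketch of such a check (WhichJar is a hypothetical helper for illustration, not part of the HIVE-12880 patch):

```java
import java.security.CodeSource;

// Sketch (hypothetical helper, not part of the patch): report which jar a
// class was actually loaded from, to diagnose classpath shadowing of the
// kind described in HIVE-12880 (an old HiveConf inside spark-assembly).
public class WhichJar {
    /** Returns "fully.qualified.Name -> jar location" for the given class. */
    static String describe(Class<?> c) {
        CodeSource src = c.getProtectionDomain().getCodeSource();
        // Core JDK classes have no CodeSource (bootstrap class loader).
        return c.getName() + " -> "
                + (src != null ? src.getLocation() : "bootstrap classloader");
    }

    public static void main(String[] args) throws Exception {
        // e.g. java WhichJar org.apache.hadoop.hive.conf.HiveConf
        String name = args.length > 0 ? args[0] : "java.lang.String";
        System.out.println(describe(Class.forName(name)));
    }
}
```

Run against org.apache.hadoop.hive.conf.HiveConf with Hive's launch classpath; if the printed location is a spark-assembly jar rather than a jar under Hive's /lib, the shadowing described above is confirmed.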