[ https://issues.apache.org/jira/browse/HIVE-16391?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16503188#comment-16503188 ]
Saisai Shao commented on HIVE-16391:
------------------------------------

Uploaded a new patch [^HIVE-16391.1.patch] to use the solution mentioned by Marcelo: add two new Maven modules and rename the original "hive-exec" module. One of the new modules is "hive-exec", which stays compatible with existing Hive users; the other, "hive-exec-spark", is built specifically for Spark. A rough sketch of what the resulting module layout could look like is at the end of this message.

> Publish proper Hive 1.2 jars (without including all dependencies in uber jar)
> -----------------------------------------------------------------------------
>
>                 Key: HIVE-16391
>                 URL: https://issues.apache.org/jira/browse/HIVE-16391
>             Project: Hive
>          Issue Type: Task
>          Components: Build Infrastructure
>    Affects Versions: 1.2.2
>            Reporter: Reynold Xin
>            Assignee: Saisai Shao
>            Priority: Major
>              Labels: pull-request-available
>             Fix For: 1.2.3
>
>         Attachments: HIVE-16391.1.patch, HIVE-16391.patch
>
>
> Apache Spark currently depends on a forked version of Apache Hive. AFAIK, the only change in the fork works around the fact that Hive publishes only two sets of jars: one set with no dependencies declared, and another with all the dependencies bundled into the published uber jar. That is to say, Hive doesn't publish a set of jars with the proper dependencies declared.
> There is general consensus on both sides that we should remove the forked Hive.
> The change in the forked version is recorded here:
> https://github.com/JoshRosen/hive/tree/release-1.2.1-spark2
> Note that the fork in the past included other fixes, but those have all become unnecessary.
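For illustration only, here is a minimal sketch of how the module split described above might look in the build. The module name "hive-exec-core" (for the renamed original module), the example version, and the POM details are my own placeholders, not the actual contents of [^HIVE-16391.1.patch].

{code:xml}
<!-- Sketch of the parent pom's <modules> section after the split.
     "hive-exec-core" is a placeholder for whatever the renamed original
     hive-exec module is actually called in the patch. -->
<modules>
  <module>hive-exec-core</module>  <!-- renamed original module with the compiled classes -->
  <module>hive-exec</module>       <!-- new: rebuilds the existing uber jar, unchanged for current users -->
  <module>hive-exec-spark</module> <!-- new: jar for Spark with its dependencies declared in the POM -->
</modules>

<!-- Sketch of the new hive-exec-spark module's pom.xml: it depends on the
     renamed core module and publishes a normal jar whose POM declares its
     dependencies, instead of bundling them into an uber jar. -->
<project xmlns="http://maven.apache.org/POM/4.0.0">
  <modelVersion>4.0.0</modelVersion>
  <parent>
    <groupId>org.apache.hive</groupId>
    <artifactId>hive</artifactId>
    <version>1.2.3-SNAPSHOT</version> <!-- example version only -->
  </parent>
  <artifactId>hive-exec-spark</artifactId>
  <dependencies>
    <dependency>
      <groupId>org.apache.hive</groupId>
      <artifactId>hive-exec-core</artifactId> <!-- placeholder artifactId -->
      <version>${project.version}</version>
    </dependency>
  </dependencies>
</project>
{code}

The idea, as I understand it, is that Spark could then depend on org.apache.hive:hive-exec-spark directly and drop the forked artifacts, while existing consumers of hive-exec keep getting the same uber jar as before.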