[ https://issues.apache.org/jira/browse/HIVE-16391?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16519931#comment-16519931 ]
Saisai Shao commented on HIVE-16391:
------------------------------------

Gently pinging [~hagleitn]: could you please review the current proposed patch and suggest the next steps? Thanks a lot.

> Publish proper Hive 1.2 jars (without including all dependencies in uber jar)
> -----------------------------------------------------------------------------
>
>                 Key: HIVE-16391
>                 URL: https://issues.apache.org/jira/browse/HIVE-16391
>             Project: Hive
>          Issue Type: Task
>          Components: Build Infrastructure
>    Affects Versions: 1.2.2
>            Reporter: Reynold Xin
>            Assignee: Saisai Shao
>            Priority: Major
>              Labels: pull-request-available
>             Fix For: 1.2.3
>
>         Attachments: HIVE-16391.1.patch, HIVE-16391.2.patch, HIVE-16391.patch
>
>
> Apache Spark currently depends on a forked version of Apache Hive. AFAIK, the only change in the fork is a workaround for the fact that Hive publishes only two sets of jars: one set with no dependencies declared, and another with all of the dependencies bundled into the published uber jar. In other words, Hive does not publish a set of jars with the proper dependencies declared.
> There is general consensus on both sides that we should remove the forked Hive.
> The change in the forked version is recorded here: https://github.com/JoshRosen/hive/tree/release-1.2.1-spark2
> Note that the fork in the past included other fixes, but those have all become unnecessary.
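For context, one common Maven pattern for publishing both kinds of artifacts is to attach the shaded uber jar under a classifier so that the default (unclassified) jar is installed with its dependencies still declared in the POM. The sketch below illustrates that general technique only; it is not taken from the attached HIVE-16391 patches, and the classifier name is arbitrary.

{code:xml}
<!-- Minimal sketch of a maven-shade-plugin setup that keeps the plain jar
     as the main artifact and attaches the uber jar under a classifier.
     Illustrative only; not the configuration from the HIVE-16391 patches. -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <executions>
    <execution>
      <phase>package</phase>
      <goals>
        <goal>shade</goal>
      </goals>
      <configuration>
        <!-- Attach the shaded jar instead of replacing the main artifact. -->
        <shadedArtifactAttached>true</shadedArtifactAttached>
        <shadedClassifierName>uber</shadedClassifierName>
        <!-- Keep the declared dependencies in the installed/deployed POM. -->
        <createDependencyReducedPom>false</createDependencyReducedPom>
      </configuration>
    </execution>
  </executions>
</plugin>
{code}

With this kind of setup, consumers that want the bundled jar can depend on the classified artifact, while the unclassified artifact resolves transitive dependencies normally through its POM.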