GitHub user jerryshao opened a pull request:

    https://github.com/apache/hive/pull/364

    HIVE-16391: Add a new classifier for hive-exec to be used by Spark

    This fix adds a new classifier (`core-spark`) for the hive-exec artifact, 
intended specifically for use by Spark. Details are in 
[SPARK-20202](https://issues.apache.org/jira/browse/SPARK-20202). 
    
    The original hive-exec artifact packages many transitive dependencies into 
its shaded jar without relocating them, which causes classpath conflicts in Spark. 
Spark only needs protobuf and kryo to be relocated, so this change proposes a new 
classifier that generates a separate artifact just for Spark; a sketch of the kind 
of shade configuration involved is shown below.
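    For illustration only, here is a minimal sketch of the kind of 
maven-shade-plugin execution that could attach a second, `core-spark`-classified 
hive-exec jar. The execution id and the relocation target packages are assumptions 
for the sketch, not the exact configuration in the patch.

```xml
<!-- Hypothetical extra execution of the maven-shade-plugin for hive-exec.
     Attaches a second artifact with the core-spark classifier and relocates
     only protobuf and kryo; shadedPattern values are assumed here. -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <executions>
    <execution>
      <id>core-spark-shade</id>
      <phase>package</phase>
      <goals>
        <goal>shade</goal>
      </goals>
      <configuration>
        <!-- Attach the shaded jar as hive-exec-<version>-core-spark.jar -->
        <shadedArtifactAttached>true</shadedArtifactAttached>
        <shadedClassifierName>core-spark</shadedClassifierName>
        <relocations>
          <relocation>
            <pattern>com.google.protobuf</pattern>
            <shadedPattern>org.apache.hive.com.google.protobuf</shadedPattern>
          </relocation>
          <relocation>
            <pattern>com.esotericsoftware</pattern>
            <shadedPattern>org.apache.hive.com.esotericsoftware</shadedPattern>
          </relocation>
        </relocations>
      </configuration>
    </execution>
  </executions>
</plugin>
```

    A downstream build such as Spark's could then depend on this artifact by 
declaring `<classifier>core-spark</classifier>` on the hive-exec dependency 
instead of consuming the default shaded jar.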
    


You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/jerryshao/hive 1.2-spark-fix

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/hive/pull/364.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

    This closes #364
    
----
commit bb27b260d82fa0a77d9fea3c123f2af8f1ea88aa
Author: jerryshao <sshao@...>
Date:   2018-06-05T06:59:37Z

    HIVE-16391: Add a new classifier for hive-exec to be used by Spark

----

