Thanks for the reply.
Actually, I don't think excluding spark-hive from spark-submit --packages
is a good idea.
I don't want to recompile a Spark assembly for my cluster every time a
new Spark release comes out.
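That would mean rerunning a build like the following for each release (the
profile set below is only a typical example from the Spark build docs, not
necessarily what my cluster needs):

    # build a Spark assembly that bundles Hive support (spark-hive)
    mvn -Phive -Phive-thriftserver -DskipTests clean package
    # or, with sbt
    build/sbt -Phive -Phive-thriftserver assembly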
I prefer using the binary distribution of Spark and then adding the jars I
need for job execution.
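In other words, what I would like to be able to run is something along these
lines (the application class and jar names are made up, and the spark-hive
coordinate is just the one matching my Spark/Scala version):

    # pull spark-hive from Maven Central at submit time instead of
    # rebuilding the assembly
    spark-submit \
      --packages org.apache.spark:spark-hive_2.10:1.4.0 \
      --class com.example.MyJob \
      my-job.jar

As it stands, though, spark-hive is excluded from --packages resolution, which
is exactly what I am questioning.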
> spark-hive is excluded when using --packages, because it can be included in
> the spark-assembly by adding -Phive during mvn package or sbt assembly.
>
> Best,
> Burak
>
> On Tue, Jul 7, 2015 at 8:06 AM, Hao Ren wrote:
>> I want to add spark-hive as a dependency to submit my job, but it seems
>> that spark