; make it work.
>
> Thanks
>
> Mich
>
> *From:* Xuefu Zhang [mailto:xzh...@cloudera.com]
> *Sent:* 04 December 2015 17:03
> *To:* user@hive.apache.org
> *Subject:* Re: FW: Getting error when trying to start master node after
> building spark 1.3
My last attempt:
1. Make sure the spark-assembly.jar from your own build doesn't contain
Hive classes, using the "jar -tf spark-assembly.jar | grep hive" command.
Copy it to Hive's /lib directory. After this, you can forget everything
about this build.
2. Download a prebuilt tarball from the Spark download page.
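The check in step 1 can be sketched as below. The jar listing here is simulated so the snippet is self-contained; with a real build you would pipe the output of `jar -tf spark-assembly.jar` into the same grep.

```shell
# Sketch of the step-1 check against a simulated jar listing.
# The class names are illustrative, not from the original mail.
listing='org/apache/spark/SparkContext.class
org/apache/spark/rdd/RDD.class
org/apache/hadoop/fs/FileSystem.class'

# If any entry mentions "hive", the assembly was built with Hive
# support and must not be copied into Hive's lib directory.
if printf '%s\n' "$listing" | grep -qi 'hive'; then
    echo "assembly contains Hive classes: rebuild without Hive"
else
    echo "assembly is clean: copy it to Hive's lib directory"
fi
```

With a clean listing like the one above, the script takes the second branch.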
I sent this one to the Spark user group but got no response:
Hi,
I am trying to make Hive work with Spark.
I have been told that I need to use Spark 1.3 and build it from source code
WITHOUT HIVE libraries.
I have built it as follows:
./make-distribution.sh --name "hadoop2-without-
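The command above is cut off in the archive. For reference, a build of this shape is what the Hive-on-Spark getting-started docs describe for Spark 1.x; the name suffix and profile list here are assumptions and depend on your Hadoop version, so treat this as a sketch rather than the poster's exact command:

```shell
# Sketch only: run from the Spark 1.3 source root. The profiles are
# assumptions based on the Hive-on-Spark docs, not the original mail.
./make-distribution.sh --name "hadoop2-without-hive" --tgz \
    -Pyarn,hadoop-provided,hadoop-2.4
```

The key point is that no `-Phive` profile appears, so the resulting assembly jar contains no Hive classes.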