Mich,
To start your Spark standalone cluster, you can just download the tarball
from the Spark release site; you don't need to start the cluster from your
own build.
You only need to copy spark-assembly.jar to Hive's /lib directory and that's it.
I guess you have been confused by this, which I
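To make the jar-copy step above concrete, here is a minimal sketch. The paths and the Spark/Hadoop versions are assumptions (this example builds a throwaway sandbox so it runs anywhere; on a real system you would point SPARK_HOME and HIVE_HOME at your actual installs):

```shell
# Sketch under assumed paths/versions: stage the Spark assembly jar into
# Hive's lib directory. A temp sandbox stands in for real installs so the
# example is self-contained.
SANDBOX=$(mktemp -d)
SPARK_HOME="$SANDBOX/spark-1.3.0-bin-hadoop2.4"   # assumed tarball extract location
HIVE_HOME="$SANDBOX/hive"                          # assumed Hive install
mkdir -p "$SPARK_HOME/lib" "$HIVE_HOME/lib"
touch "$SPARK_HOME/lib/spark-assembly-1.3.0-hadoop2.4.0.jar"  # stand-in for the real jar

# The only artifact Hive on Spark needs from the Spark distribution is the
# assembly jar; copy it into Hive's lib directory.
cp "$SPARK_HOME"/lib/spark-assembly-*.jar "$HIVE_HOME/lib/"
ls "$HIVE_HOME/lib"
```

On a real cluster the same single `cp` is the whole step; no other Spark jars need to go into Hive's classpath.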
Hi,
I have seen mails stating that users have managed to build Spark 1.3 to
work with Hive. I tried Spark 1.5.2, but no luck.
I downloaded the Spark 1.3 source code (spark-1.3.0.tar) and built it as
follows:
./make-distribution.sh --name "hadoop2-without-hive" --tgz
"-Pyarn,hadoop-pr
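The profile string in the command above is cut off in the original mail and is left as-is. For illustration only, the flags below are an assumption about what the full invocation commonly looks like for a "without Hive" Spark build (printed rather than executed, since it only makes sense inside a Spark source tree):

```shell
# Illustration only: an assumed full form of the truncated command above.
# Profiles and Hadoop version would need to match your cluster.
cat <<'EOF'
./make-distribution.sh --name "hadoop2-without-hive" --tgz \
  "-Pyarn,hadoop-provided,hadoop-2.4,parquet-provided"
EOF
```

The key point of such a build is the `hadoop-provided` style profiles, which keep Hadoop (and Hive) classes out of the resulting assembly so they don't clash with Hive's own.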