Re: sbt assembly with hive

2014-12-12 Thread Abhi Basu
I am getting the same message when trying to create a HiveContext in CDH 5.1 after enabling Spark. I think Spark should ship with Hive support enabled by default, since the Hive metastore is a common way to share data, given the popularity of Hive and other SQL-on-Hadoop technologies like Impala. Thanks, A
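For anyone hitting this on a distribution-provided assembly, a quick diagnostic is to probe from the PySpark shell for the class that the -Phive profile adds. This is only a sketch of one way to check, not anything CDH-specific; it relies on the standard sc._jvm gateway and the usual class name from spark-hive:

    # from a bin/pyspark session: if the assembly was built without -Phive,
    # the Hive classes are absent and this raises a Py4JJavaError wrapping
    # a ClassNotFoundException
    sc._jvm.java.lang.Class.forName("org.apache.spark.sql.hive.HiveContext")

If that call fails, the assembly on the cluster was not built with the hive profile, and no amount of configuration on the Python side will make HiveContext work.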

sbt assembly with hive

2014-12-12 Thread Stephen Boesch
What is the proper way to build with Hive support from sbt? The SPARK_HIVE environment variable is deprecated. However, after running the following:

    sbt -Pyarn -Phadoop-2.3 -Phive assembly/assembly

and then, in bin/pyspark:

    hivectx = HiveContext(sc)
    hivectx.hiveql("select * from my_table")

I get:

    Exception: ("You must bui
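For clarity, here is roughly the full sequence as a transcript (a sketch only; the flags and the table name my_table come from the commands above, and the exact query method depends on the Spark version in use):

    $ sbt -Pyarn -Phadoop-2.3 -Phive assembly/assembly
    $ bin/pyspark
    >>> from pyspark.sql import HiveContext
    >>> hivectx = HiveContext(sc)
    >>> # raises the exception quoted above if the assembly lacks Hive;
    >>> # on newer 1.x builds hivectx.sql(...) replaces the deprecated hiveql()
    >>> hivectx.hiveql("select * from my_table")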