Re: Building Spark to use for Hive on Spark

2015-11-22 Thread Lefty Leverenz
Thanks Xuefu! -- Lefty On Mon, Nov 23, 2015 at 1:09 AM, Xuefu Zhang wrote: > HoS is supposed to work with any version of Hive (1.1+) and a version of > Spark built w/o Hive. Thus, to make HoS work reliably and also to simplify > matters, I think it still makes sense to require that the spark-assembly jar > shouldn't …

Re: Building Spark to use for Hive on Spark

2015-11-22 Thread Xuefu Zhang
HoS is supposed to work with any version of Hive (1.1+) and a version of Spark built w/o Hive. Thus, to make HoS work reliably and also to simplify matters, I think it still makes sense to require that the spark-assembly jar shouldn't contain Hive jars. Otherwise, you have to make sure that your Hive version matches …
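
For reference, a minimal sketch of the kind of build Xuefu describes, loosely following the Hive on Spark getting-started steps for Spark 1.x. The distribution name, Hadoop profile, and version below are placeholders to adjust for your cluster; the essential point is that -Phive and -Phive-thriftserver are not passed, so the resulting spark-assembly contains no Hive classes:

  # Build a Spark 1.x assembly without Hive jars (run from the Spark source root).
  # Profiles and versions are illustrative; match them to your Hadoop cluster.
  ./make-distribution.sh --name hadoop2-without-hive --tgz \
    -Pyarn -Phadoop-2.6 -Dhadoop.version=2.6.0

  # Make the assembly visible to Hive (e.g. by linking it into Hive's lib directory).
  ln -s /path/to/spark-without-hive/lib/spark-assembly-*.jar $HIVE_HOME/lib/

  # In the Hive session (or hive-site.xml), switch the execution engine and point
  # spark.master at your cluster manager (the value depends on your deployment):
  set hive.execution.engine=spark;
  set spark.master=yarn-cluster;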

Re: Building Spark to use for Hive on Spark

2015-11-22 Thread Lefty Leverenz
Gopal, can you confirm the doc change that Jone Zhang suggests? The second sentence confuses me: "You can choose Spark1.5.0+ which build include the Hive jars." Thanks. -- Lefty On Thu, Nov 19, 2015 at 8:33 PM, Jone Zhang wrote: > I should add that Spark 1.5.0+ uses Hive 1.2.1 by default when …

Re: Building Spark to use for Hive on Spark

2015-11-19 Thread Jone Zhang
I should add that Spark 1.5.0+ uses Hive 1.2.1 by default when you build with -Phive, so this page should read like below: “Note that you must have a version of Spark which does *not* include the Hive jars if you use Spark1. …”
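
A quick way to tell which kind of assembly you have (the jar name below is just an example for a Spark 1.5.0 build): if the grep prints nothing, the assembly was built without Hive and matches what the Hive on Spark docs ask for; if it prints class entries, it was built with -Phive and bundles its own Hive (1.2.1 by default in Spark 1.5.0+, as noted above).

  # List the assembly contents and look for bundled Hive classes.
  # A non-empty result means the assembly was built with -Phive.
  jar tf spark-assembly-1.5.0-hadoop2.6.0.jar | grep 'org/apache/hadoop/hive/' | head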

Re: Building Spark to use for Hive on Spark

2015-11-18 Thread Gopal Vijayaraghavan
> I wanted to know why it is necessary to remove the Hive jars from the > Spark build as mentioned on this … Because SparkSQL was originally based on Hive and still uses Hive's AST to parse SQL. The org.apache.spark.sql.hive package contains the parser, which has hard references to Hive's internal …
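
To see the coupling Gopal describes for yourself, you can inspect the Spark-side parser in a -Phive assembly and look for references into Hive's packages. The class name used below, org.apache.spark.sql.hive.HiveQl, is my recollection of where the Spark 1.x parser lives, so treat it as an assumption and adjust for your Spark version:

  # Print the members of SparkSQL's Hive parser and grep for Hive types in its
  # signatures; hits such as org.apache.hadoop.hive.ql.parse.ASTNode show the
  # hard references into Hive internals that force the bundled versions to match.
  javap -p -classpath spark-assembly-1.5.0-hadoop2.6.0.jar 'org.apache.spark.sql.hive.HiveQl$' | grep 'org.apache.hadoop.hive' | head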

Building Spark to use for Hive on Spark

2015-11-18 Thread Udit Mehta
Hi, I am planning to test out the Hive on Spark functionality provided by the newer versions of Hive. I wanted to know why it is necessary to remove the Hive jars from the Spark build, as mentioned on this page.