From: Jimmy Xiang [mailto:jxi...@cloudera.com]
Sent: 30 November 2015 17:28
To: user@hive.apache.org
Subject: Re: Hive version with Spark
Hi Sofia,
For Hive 1.2.1, you should not use Spark 1.5; there are some incompatible
interface changes in Spark 1.5.
Have you tried Hive 1.2.1 with Spark 1.3.1? As Udit pointed out, you can
follow the instructions at
https://cwiki.apache.org/confluence/display/Hive/Hive+on+Spark%3A+Getting+Started
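For reference, a minimal sketch of pointing Hive at the Spark engine once a
compatible Spark is installed, per that wiki page; the master URL and memory
value below are placeholder assumptions, not settings from this thread:

set hive.execution.engine=spark;
-- anything other than 'local', e.g. yarn-client or a spark://host:7077 URL
set spark.master=yarn-client;
-- placeholder sizing; tune for your cluster
set spark.executor.memory=2g;
set spark.eventLog.enabled=true;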
Sofia,
What specific problem did you encounter when setting spark.master to
something other than local?
Thanks,
Xuefu
On Sat, Nov 28, 2015 at 1:14 AM, Sofia Panagiotidi <
sofia.panagiot...@taiger.com> wrote:
Hi Mich,
I never managed to run Hive on Spark with a spark master other than local so I
am afraid I don’t have a reply here.
But do try some things. Firstly, run hive as
hive --hiveconf hive.root.logger=DEBUG,console
so that you are able to see what the exact error is.
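If passing that flag on every invocation is a nuisance, the same logger level
can presumably be set persistently in Hive's log4j config; the path below
assumes a default Hive 1.2.1 layout where the template file has been copied
into place:

# in $HIVE_HOME/conf/hive-log4j.properties
hive.root.logger=DEBUG,console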
I am afraid I cannot be of more help.
Hi Sophia,
There is no hadoop-2.6 profile. I believe you should use the hadoop-2.4 profile as shown below:
mvn -Phadoop-2.4 -Dhadoop.version=2.6.0 -DskipTests clean package
Also, if you are building it for the Hive on Spark engine, you should not include
the Hadoop jar files in your build.
For example I tr
-Phive is enough.
-Phive will use Hive 1.2.1 by default on Spark 1.5.0+.
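For context, a Spark-with-Hive build (the opposite of what Hive on Spark
needs) would look roughly like this; the exact profile set depends on your
Hadoop version and is an assumption here:

mvn -Pyarn -Phive -Phive-thriftserver -DskipTests clean package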
2015-11-19 4:50 GMT+08:00 Udit Mehta:
As per this link:
https://cwiki.apache.org/confluence/display/Hive/Hive+on+Spark%3A+Getting+Started,
you need to build Spark without Hive.
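At the time of this thread, the wiki's recipe for such a build was roughly
the following; note the absence of -Phive, and treat the profile list as
approximate since it varies by Spark version:

./make-distribution.sh --name "hadoop2-without-hive" --tgz "-Pyarn,hadoop-provided,hadoop-2.4"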
On Wed, Nov 18, 2015 at 8:50 AM, Sofia wrote:
> Hello
>
> After various failed attempts to use my Hive (1.2.1) with my Spark (Spark
> 1.4.1 built for Hadoop 2.