From: Jimmy Xiang [mailto:jxi...@cloudera.com]
Sent: 30 November 2015 17:28
To: user@hive.apache.org
Subject: Re: Hive version with Spark
>> HTH,
>>
>> Mich
>>
>
> From: Sofia [mailto:sofia.panagiot...@taiger.com]
> Sent: 18 November 2015 16:50
> To: user@hive.apache.org
> Subject: Hive version with Spark
-Phive is enough
-Phive will use Hive 1.2.1 by default on Spark 1.5.0+
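
For anyone trying that route, the standard Spark Maven build with that profile looks roughly like the following. This is a sketch: the -Pyarn and -Phadoop-2.6 profiles are illustrative choices and should match your cluster.

  # Build Spark 1.5+ with Hive support compiled in (Hive 1.2.1 by default).
  # Profiles other than -Phive are illustrative; match them to your setup.
  mvn -Pyarn -Phadoop-2.6 -Phive -Phive-thriftserver -DskipTests clean package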
2015-11-19 4:50 GMT+08:00 Udit Mehta :
> As per this link :
> https://cwiki.apache.org/confluence/display/Hive/Hive+on+Spark%3A+Getting+Started,
> you need to build Spark without Hive.
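For reference, the "without Hive" packaging that wiki page describes for Spark 1.5 looks roughly like this. Treat it as a sketch: the exact profile list depends on your Spark and Hadoop versions.

  # Package Spark without bundled Hive classes, so that Hive on Spark can
  # supply its own Hive jars at runtime (profiles shown are illustrative).
  ./make-distribution.sh --name "hadoop2-without-hive" --tgz "-Pyarn,hadoop-provided,hadoop-2.4,parquet-provided"
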
On Wed, Nov 18, 2015 at 8:50 AM, Sofia wrote:
Hello
After various failed attempts to use my Hive (1.2.1) with my Spark (Spark 1.4.1
built for Hadoop 2.2.0), I decided to try building Spark with Hive again.
I would like to know the latest Hive version that can be used to build
Spark at this point.
When downloading Spark 1.5 source and t
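
Once a matching Spark build is in place, the Hive-side wiring from the same Getting Started page comes down to a few settings. This is a sketch: the master URL and memory value are placeholders for your environment.

  -- In hive-site.xml or at the Hive CLI (values are placeholders):
  set hive.execution.engine=spark;
  set spark.master=yarn-cluster;
  set spark.eventLog.enabled=true;
  set spark.executor.memory=512m;
  set spark.serializer=org.apache.spark.serializer.KryoSerializer;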