Re: Hive version with Spark

2015-11-30 Thread Sofia

RE: Hive version with Spark

2015-11-30 Thread Mich Talebzadeh

Re: Hive version with Spark

2015-11-30 Thread Jimmy Xiang

Re: Hive version with Spark

2015-11-29 Thread Xuefu Zhang

Re: Hive version with Spark

2015-11-28 Thread Sofia Panagiotidi

RE: Hive version with Spark

2015-11-26 Thread Mich Talebzadeh

Re: Hive version with Spark

2015-11-19 Thread Jone Zhang
-Phive is enough. -Phive will use Hive 1.2.1 by default on Spark 1.5.0+.
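The advice above can be sketched as a build invocation. This is a minimal sketch, assuming a Maven-based build run from the root of a Spark 1.5.x source checkout; the Hadoop profile and the -Phive-thriftserver flag are assumptions for a typical deployment, not taken from the thread.

```shell
# Hedged sketch: build Spark 1.5.x with Hive support from the source root.
# -Phive pulls in Hive 1.2.1 by default on Spark 1.5.0+ (per Jone Zhang above);
# -Phadoop-2.6 and -Phive-thriftserver are illustrative assumptions.
./make-distribution.sh --name with-hive --tgz \
  -Phadoop-2.6 -Phive -Phive-thriftserver -DskipTests
```

The resulting tarball bundles the Hive 1.2.1 client libraries, so no separate Hive build step is needed when using Spark's own SQL support.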

Re: Hive version with Spark

2015-11-18 Thread Udit Mehta
As per this link: https://cwiki.apache.org/confluence/display/Hive/Hive+on+Spark%3A+Getting+Started, you need to build Spark without Hive.
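For the Hive-on-Spark path described above, the wiki's recommendation is the opposite build: a Spark assembly that omits the -Phive profile so Hive's own classes are used at runtime. A minimal sketch, assuming a Spark source checkout; the distribution name and profile set here are illustrative assumptions, so check the wiki for the exact flags matching your Spark version.

```shell
# Hedged sketch: build a Hive-free Spark assembly for use under Hive on Spark.
# Note: -Phive is deliberately ABSENT; -Pyarn and the Hadoop version are
# illustrative assumptions for a typical YARN cluster.
./make-distribution.sh --name hadoop2-without-hive --tgz \
  -Pyarn -Phadoop-2.6 -DskipTests
```

Hive then points at this distribution (e.g. via spark.home) instead of a Hive-enabled Spark build.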

Hive version with Spark

2015-11-18 Thread Sofia
Hello,

After various failed tries to use my Hive (1.2.1) with my Spark (Spark 1.4.1 built for Hadoop 2.2.0), I decided to try to build Spark with Hive again. I would like to know the latest Hive version that can be used to build Spark at this point. When downloading the Spark 1.5 source and t…