Re: problems with build of latest the master

2015-07-16 Thread Steve Loughran
> If we already mention it, is it possible to make it part of the current dependencies, but only for Hadoop profiles 2.4 and up? This would spare a lot of headaches for those who use Spark + OpenStack Swift and currently need to manually …

Re: problems with build of latest the master

2015-07-15 Thread Sean Owen

Re: problems with build of latest the master

2015-07-15 Thread Gil Vernik
> You shouldn't get the dependencies you need from Spark, right? You declare direct dependencies. …

Re: problems with build of latest the master

2015-07-15 Thread Sean Owen

Re: problems with build of latest the master

2015-07-15 Thread Gil Vernik
…to add a dependency on it.

> If I understand correctly, hadoop-openstack is not currently a dependency in Spark. …
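Since hadoop-openstack is not pulled in by Spark, a downstream project that needs it would declare it directly in its own build. A minimal sketch, assuming Maven and Hadoop 2.6.0 (the groupId/artifactId are the real Hadoop coordinates; the version shown is illustrative):

```xml
<!-- Declare hadoop-openstack in your own pom.xml; Spark does not provide it transitively. -->
<dependency>
  <groupId>org.apache.hadoop</groupId>
  <artifactId>hadoop-openstack</artifactId>
  <version>2.6.0</version>
</dependency>
```

This keeps the Swift filesystem support a direct, explicit dependency of the application rather than something inherited from Spark's profiles.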

Re: problems with build of latest the master

2015-07-15 Thread Ted Yu
If I understand correctly, hadoop-openstack is not currently a dependency in Spark.

> On Jul 15, 2015, at 8:21 AM, Josh Rosen wrote:
> We may be able to fix this from the Spark side by adding appropriate exclusions in our Hadoop dependencies, right? If possible, I think that we should …

Re: problems with build of latest the master

2015-07-15 Thread Josh Rosen
We may be able to fix this from the Spark side by adding appropriate exclusions in our Hadoop dependencies, right? If possible, I think that we should do this.

> On Wed, Jul 15, 2015 at 7:10 AM, Ted Yu wrote:
> I attached a patch for HADOOP-12235. BTW openstack was not mentioned in the first email …
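The exclusion approach Josh describes could look like the following in a Maven pom. This is a sketch only: it assumes the conflicting Mockito comes in transitively via hadoop-client, which is an assumption here, not something established in the thread (the actual offending path would need to be confirmed, e.g. with `mvn dependency:tree`):

```xml
<dependency>
  <groupId>org.apache.hadoop</groupId>
  <artifactId>hadoop-client</artifactId>
  <version>2.6.0</version>
  <exclusions>
    <!-- Keep Hadoop's transitive test-library version from clashing with Spark's own. -->
    <exclusion>
      <groupId>org.mockito</groupId>
      <artifactId>mockito-all</artifactId>
    </exclusion>
  </exclusions>
</dependency>
```

With the exclusion in place, the build resolves only the Mockito version that Spark itself declares.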

Re: problems with build of latest the master

2015-07-15 Thread Ted Yu
I attached a patch for HADOOP-12235. BTW, openstack was not mentioned in the first email from Gil; my email and Gil's second email were sent at around the same time. Cheers

> On Wed, Jul 15, 2015 at 2:06 AM, Steve Loughran wrote:
> > On 14 Jul 2015, at 12:22, Ted Yu wrote:
> > Looking at Jenkins …

Re: problems with build of latest the master

2015-07-15 Thread Steve Loughran
> On 14 Jul 2015, at 12:22, Ted Yu <yuzhih...@gmail.com> wrote:
> Looking at Jenkins, master branch compiles. Can you try the following command?
> mvn -Phive -Phadoop-2.6 -DskipTests clean package
> What version of Java are you using?

Ted, Gil has stuck in hadoop-openstack; it's that which …

Re: problems with build of latest the master

2015-07-14 Thread Gil Vernik
…-all. I guess this is needed for Hadoop version 2.6.0, but perhaps the latest Hadoop versions have the same Mockito version as Spark uses.

Gil Vernik
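One way to make the Mockito versions agree across such a build, sketched here as a suggestion rather than anything proposed in the thread, is a Maven dependencyManagement pin so Hadoop's and Spark's transitive choices cannot diverge (the version shown is illustrative, not necessarily the one Spark uses):

```xml
<dependencyManagement>
  <dependencies>
    <!-- Force a single Mockito version for the whole build. -->
    <dependency>
      <groupId>org.mockito</groupId>
      <artifactId>mockito-all</artifactId>
      <version>1.10.19</version>
    </dependency>
  </dependencies>
</dependencyManagement>
```

Running `mvn dependency:tree -Dincludes=org.mockito` before and after the pin shows which artifacts were bringing in the divergent versions.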

Re: problems with build of latest the master

2015-07-14 Thread Ted Yu
Looking at Jenkins, master branch compiles. Can you try the following command?

mvn -Phive -Phadoop-2.6 -DskipTests clean package

What version of Java are you using? Cheers

> On Tue, Jul 14, 2015 at 2:23 AM, Gil Vernik wrote:
> I just did a checkout of the master and tried to build it with …

problems with build of latest the master

2015-07-14 Thread Gil Vernik
I just did a checkout of the master and tried to build it with

mvn -Dhadoop.version=2.6.0 -DskipTests clean package

Got:

[ERROR] /Users/gilv/Dev/Spark/spark/core/src/test/java/org/apache/spark/shuffle/unsafe/UnsafeShuffleWriterSuite.java:117: error: cannot find symbol
[ERROR] when(shuffleMemo…