Re: Building Spark with Custom Hadoop Version

2016-02-05 Thread Steve Loughran
> On 4 Feb 2016, at 23:11, Ted Yu wrote:
>
> Assuming your change is based on hadoop-2 branch, you can use 'mvn install'
> command which would put artifacts under 2.8.0-SNAPSHOT subdir in your local
> maven repo.

+ generally, unless you want to run all the hadoop tests, set the -DskipTests flag
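The install step described above can be sketched as follows. This is a minimal sketch, not from the thread itself; the 2.8.0-SNAPSHOT version comes from Ted's reply, and the repository path assumes Maven's default local-repo location:

```shell
# From the root of the modified Hadoop source tree (hadoop-2 branch):
# build and install all Hadoop artifacts into the local Maven repository,
# skipping the lengthy Hadoop test suite.
mvn install -DskipTests

# The installed artifacts then appear under the local repo, e.g.:
ls ~/.m2/repository/org/apache/hadoop/hadoop-hdfs/2.8.0-SNAPSHOT
```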

Re: Building Spark with Custom Hadoop Version

2016-02-04 Thread Ted Yu
Assuming your change is based on the hadoop-2 branch, you can use the 'mvn install' command, which puts artifacts under a 2.8.0-SNAPSHOT subdir in your local maven repo. Here is an example: ~/.m2/repository/org/apache/hadoop/hadoop-hdfs/2.8.0-SNAPSHOT Then you can use the following command to build Spark:
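The exact build command is truncated in the archive. A typical invocation of the kind Ted describes, assuming Spark's hadoop-2.x build profiles of that era and the locally installed snapshot from the previous step, would look like:

```shell
# Build Spark against the locally installed Hadoop snapshot.
# -Dhadoop.version points Maven at the 2.8.0-SNAPSHOT artifacts that
# 'mvn install' placed in ~/.m2; -DskipTests skips Spark's own tests.
./build/mvn -Phadoop-2.7 -Dhadoop.version=2.8.0-SNAPSHOT -DskipTests clean package
```

Because the snapshot sits in the local repository, Maven resolves it before checking remote repositories, so Spark links against the modified Hadoop classes.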

Building Spark with Custom Hadoop Version

2016-02-04 Thread Charles Wright
Hello, I have made some modifications to the YARN source code that I want to test with Spark. How do I do this? I know that I need to include my custom Hadoop jar as a dependency, but I don't know how, as I am not very familiar with Maven. Any help is appreciated. Thanks, Charles.