Yeah, I think someone even suggested that today in a separate
thread. Couldn't hurt to just add an example.
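
For the record, something like this is what I'd expect the docs to show
(2.5.1 here is just the version from this thread; per the discussion
below, any 2.4+ release should work the same way):

  # example only: the hadoop-2.4 profile with its default version overridden
  mvn -Phadoop-2.4 -Dhadoop.version=2.5.1 -DskipTests clean package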

On Fri, Nov 14, 2014 at 4:48 PM, Corey Nolet <cjno...@gmail.com> wrote:
> In the past, I've built it by providing -Dhadoop.version=2.5.1, exactly as
> you've mentioned. What prompted me to write this email was that I did not
> see any documentation telling me Hadoop 2.5.1 was officially supported by
> Spark (i.e., the community has been using it, bugs are being fixed, etc.).
> It builds and tests pass, but there could be other implications that I
> have not run into based on my own use of the framework.
>
> If we are saying that the standard procedure is to build with the hadoop-2.4
> profile and override the -Dhadoop.version property, should we at least
> document that in the build instructions [1]?
>
> [1] http://spark.apache.org/docs/latest/building-with-maven.html
>
> On Fri, Nov 14, 2014 at 10:46 AM, Sean Owen <so...@cloudera.com> wrote:
>>
>> I don't think it's necessary. You're looking at the hadoop-2.4
>> profile, which works with anything >= 2.4. AFAIK there is no further
>> specialization needed beyond that. The profile sets hadoop.version to
>> 2.4.0 by default, but this can be overridden.
>>
>> On Fri, Nov 14, 2014 at 3:43 PM, Corey Nolet <cjno...@gmail.com> wrote:
>> > I noticed Spark 1.2.0-SNAPSHOT still has 2.4.x in the pom. Since 2.5.x is
>> > the current stable Hadoop 2.x, would it make sense for us to update the
>> > poms?
>
>

---------------------------------------------------------------------
To unsubscribe, e-mail: dev-unsubscr...@spark.apache.org
For additional commands, e-mail: dev-h...@spark.apache.org
