Yeah, I think someone even just suggested that today in a separate
thread? Couldn't hurt to just add an example.
On Fri, Nov 14, 2014 at 4:48 PM, Corey Nolet wrote:
> In the past, I've built it by providing -Dhadoop.version=2.5.1 exactly like
> you've mentioned. What prompted me to write this email was that I did not
> see any documentation that told me Hadoop 2.5.1 was officially supported by
> Spark (i.e. the community has been using it, any bugs are being fixed, etc.).
You're the second person to request this today. Planning to include this in my
PR for SPARK-4338.
-Sandy
> On Nov 14, 2014, at 8:48 AM, Corey Nolet wrote:
>
> In the past, I've built it by providing -Dhadoop.version=2.5.1 exactly like
> you've mentioned. What prompted me to write this email was that I did not
> see any documentation that told me Hadoop 2.5.1 was officially supported by
> Spark (i.e. the community has been using it, any bugs are being fixed, etc.).
In the past, I've built it by providing -Dhadoop.version=2.5.1 exactly like
you've mentioned. What prompted me to write this email was that I did not
see any documentation that told me Hadoop 2.5.1 was officially supported by
Spark (i.e. the community has been using it, any bugs are being fixed, etc.).
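
Concretely, a build like the one described would be invoked along these lines
(a sketch: -Phadoop-2.4 and -DskipTests follow the documented Spark Maven
build conventions, but the exact set of profiles, e.g. -Pyarn, depends on the
deployment):

    mvn -Phadoop-2.4 -Dhadoop.version=2.5.1 -DskipTests clean package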
I don't think it's necessary. You're looking at the hadoop-2.4
profile, which works with anything >= 2.4. AFAIK there is no further
specialization needed beyond that. The profile sets hadoop.version to
2.4.0 by default, but this can be overridden.
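For illustration, the hadoop-2.4 profile in the root pom.xml amounts to
roughly the following (a simplified sketch; the real profile also pins other
related dependency versions):

    <profile>
      <id>hadoop-2.4</id>
      <properties>
        <!-- Default only; override from the command line,
             e.g. -Dhadoop.version=2.5.1 -->
        <hadoop.version>2.4.0</hadoop.version>
      </properties>
    </profile>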
On Fri, Nov 14, 2014 at 3:43 PM, Corey Nolet wrote:
I noticed Spark 1.2.0-SNAPSHOT still has 2.4.x in the pom. Since 2.5.x is
the current stable Hadoop 2.x, would it make sense for us to update the
poms?