I found the problem - the pom.xml I was using also contained an old
dependency on a mahout library, which was pulling in the old hadoop-core.
Removing that dependency fixed the problem.
Thank you!
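
In case anyone hits the same conflict but still needs the mahout
dependency, an exclusion has the same effect as removing it outright.
A minimal sketch, assuming the offending artifact is
org.apache.mahout:mahout-core (the actual coordinates and version in
the pom may differ):

    <dependency>
      <groupId>org.apache.mahout</groupId>
      <artifactId>mahout-core</artifactId>
      <!-- hypothetical version; use whatever the pom already declares -->
      <version>0.9</version>
      <exclusions>
        <!-- keep mahout, drop the hadoop-1 era hadoop-core it pulls in -->
        <exclusion>
          <groupId>org.apache.hadoop</groupId>
          <artifactId>hadoop-core</artifactId>
        </exclusion>
      </exclusions>
    </dependency>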
On Mon, Sep 21, 2015 at 2:54 PM, Ted Yu wrote:
bq. hadoop-core-0.20.204.0
How come the above got into play - it was from hadoop-1
On Mon, Sep 21, 2015 at 11:34 AM, Ellen Kraffmiller <ellen.kraffmil...@gmail.com> wrote:
I am including the Spark core dependency in my maven pom.xml:

<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-core_2.10</artifactId>
  <version>1.5.0</version>
</dependency>

This is bringing in these hadoop versions:
hadoop-annotations-2.2.0
hadoop-auth-2.2.0
hadoop-client-2.2.0
hadoop-common-2.2.0
hadoop-core-0.20.204.0
hadoop-hdfs-2.2.0
followed by mapreduce and yarn
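
A side note, not stated in the thread: the 2.2.0 artifacts above match
what spark-core_2.10 1.5.0 itself depends on, so the hadoop-1 era
hadoop-core-0.20.204.0 likely comes from some other declaration in the
pom. The Maven dependency plugin can show which one:

    mvn dependency:tree -Dincludes=org.apache.hadoop:hadoop-core

The printed tree traces the path from each declared dependency down to
hadoop-core, pointing at the offending entry.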
What Spark package are you using? In particular, which hadoop version?
On Mon, Sep 21, 2015 at 9:14 AM, ekraffmiller wrote:
> Hi,
> I’m trying to run a simple test program to access Spark through Java. I’m
> using JDK 1.8, and Spark 1.5. I’m getting an Exception from the
> JavaSparkContext constructor.
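
For reference, a minimal JavaSparkContext setup of the kind described
above (a sketch against the Spark 1.5 Java API; the class name, app
name, and local master are placeholders, not taken from the thread):

    import org.apache.spark.SparkConf;
    import org.apache.spark.api.java.JavaSparkContext;

    public class SparkTest {
        public static void main(String[] args) {
            // Local-mode configuration; the JavaSparkContext constructor
            // is where a hadoop-core version clash like the one in this
            // thread surfaces at runtime.
            SparkConf conf = new SparkConf()
                .setAppName("SparkTest")
                .setMaster("local[*]");
            JavaSparkContext sc = new JavaSparkContext(conf);
            sc.stop();
        }
    }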