I am including the Spark core dependency in my Maven pom.xml:

<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.10</artifactId>
    <version>1.5.0</version>
</dependency>

This is bringing in these Hadoop artifacts:
hadoop-annotations-2.2.0
hadoop-auth-2.2.0
hadoop-client-2.2.0
hadoop-common-2.2.0
hadoop-core-0.20.204.0
hadoop-hdfs-2.2.0
followed by mapreduce and yarn dependencies... let me know if you need the
full list.
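
One thing that stands out to me: hadoop-core-0.20.204.0 doesn't match the
2.2.0 artifacts above. If mvn dependency:tree -Dincludes=org.apache.hadoop
shows it coming in transitively, here is a rough sketch of an exclusion
(assuming, for illustration, that spark-core is what pulls it in; the
exclusion belongs on whichever dependency actually does):

<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.10</artifactId>
    <version>1.5.0</version>
    <exclusions>
        <!-- Assumption: the stray hadoop-core 0.20.204.0 comes in through
             this artifact; move the exclusion if the tree says otherwise. -->
        <exclusion>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-core</artifactId>
        </exclusion>
    </exclusions>
</dependency>
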
Thanks,
Ellen


On Mon, Sep 21, 2015 at 1:48 PM, Marcelo Vanzin <van...@cloudera.com> wrote:

> What Spark package are you using? In particular, which hadoop version?
>
> On Mon, Sep 21, 2015 at 9:14 AM, ekraffmiller
> <ellen.kraffmil...@gmail.com> wrote:
> > Hi,
> > I’m trying to run a simple test program to access Spark though Java.  I’m
> > using JDK 1.8, and Spark 1.5.  I’m getting an Exception from the
> > JavaSparkContext constructor.  My initialization code matches all the
> sample
> > code I’ve found online, so not sure what I’m doing wrong.
> >
> > Here is my code:
> >
> > SparkConf conf = new SparkConf().setAppName("my app").setMaster("local");
> > JavaSparkContext sc = new JavaSparkContext(conf);
> >
> > The stack trace of the Exception:
> >
> > java.lang.ExceptionInInitializerError: null
> >         at java.lang.Class.getField(Class.java:1690)
> >         at org.apache.spark.util.SparkShutdownHookManager.install(ShutdownHookManager.scala:220)
> >         at org.apache.spark.util.ShutdownHookManager$.shutdownHooks$lzycompute(ShutdownHookManager.scala:50)
> >         at org.apache.spark.util.ShutdownHookManager$.shutdownHooks(ShutdownHookManager.scala:48)
> >         at org.apache.spark.util.ShutdownHookManager$.addShutdownHook(ShutdownHookManager.scala:189)
> >         at org.apache.spark.util.ShutdownHookManager$.<init>(ShutdownHookManager.scala:58)
> >         at org.apache.spark.util.ShutdownHookManager$.<clinit>(ShutdownHookManager.scala)
> >         at org.apache.spark.storage.DiskBlockManager.addShutdownHook(DiskBlockManager.scala:147)
> >         at org.apache.spark.storage.DiskBlockManager.<init>(DiskBlockManager.scala:54)
> >         at org.apache.spark.storage.BlockManager.<init>(BlockManager.scala:75)
> >         at org.apache.spark.storage.BlockManager.<init>(BlockManager.scala:173)
> >         at org.apache.spark.SparkEnv$.create(SparkEnv.scala:345)
> >         at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:193)
> >         at org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:276)
> >         at org.apache.spark.SparkContext.<init>(SparkContext.scala:441)
> >         at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:61)
> >         at edu.harvard.iq.text.core.spark.SparkControllerTest.testMongoRDD(SparkControllerTest.java:63)
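> >
> > (Side note: the "null" message hides the real failure;
> > ExceptionInInitializerError wraps whatever the static initializer threw.
> > A minimal sketch to surface it, reusing the conf from above:
> >
> > try {
> >     JavaSparkContext sc = new JavaSparkContext(conf);
> > } catch (ExceptionInInitializerError e) {
> >     // The underlying exception is attached as the cause; given the
> >     // Class.getField frame, it is probably a NoSuchFieldException.
> >     e.getCause().printStackTrace();
> >     throw e;
> > }
> >
> > That should print the exception Class.getField actually threw.)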
> >
> > Thanks,
> > Ellen
> >
>
> --
> Marcelo
>
