Thanks Rahul, let me try that.
On Apr 7, 2014 7:33 PM, "Rahul Singhal" <rahul.sing...@guavus.com> wrote:

>   Hi Sai,
>
>  I also ran into this problem recently on 0.9.1. The problem is that
> Spark tries to read YARN's classpath, but when it finds it empty it does
> not fall back to its default value. To resolve this, either set
> yarn.application.classpath in yarn-site.xml to its default value or apply
> a bug fix. FYI, this issue appears to be resolved in the master branch (1.0).
>
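>  For anyone taking the yarn-site.xml route, a minimal property entry
> would look like the one below. The value is the default listed in the
> yarn-default.xml page linked under "Reference"; the $HADOOP_* variables
> are assumptions about a standard Hadoop 2.3.0 layout and must resolve on
> your cluster.
>
>  <property>
>    <name>yarn.application.classpath</name>
>    <value>$HADOOP_CONF_DIR,$HADOOP_COMMON_HOME/share/hadoop/common/*,$HADOOP_COMMON_HOME/share/hadoop/common/lib/*,$HADOOP_HDFS_HOME/share/hadoop/hdfs/*,$HADOOP_HDFS_HOME/share/hadoop/hdfs/lib/*,$HADOOP_YARN_HOME/share/hadoop/yarn/*,$HADOOP_YARN_HOME/share/hadoop/yarn/lib/*</value>
>  </property>
>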
>  Reference:
> https://groups.google.com/forum/#!topic/shark-users/8qFsy9JSt4E
>
> http://hadoop.apache.org/docs/r2.3.0/hadoop-yarn/hadoop-yarn-common/yarn-default.xml
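>
>  If you would rather patch Spark itself, here is a minimal sketch of the
> fallback fix, assuming the shape of the change rather than quoting the
> exact patch that landed in master (the helper name hadoopClasspath is
> hypothetical):
>
>  import org.apache.hadoop.conf.Configuration
>  import org.apache.hadoop.yarn.conf.YarnConfiguration
>
>  // getStrings() returns null when yarn.application.classpath is unset,
>  // which is what the foreach inside populateHadoopClasspath trips over.
>  // Fall back to YARN's compiled-in default entries instead.
>  def hadoopClasspath(conf: Configuration): Seq[String] =
>    Option(conf.getStrings(YarnConfiguration.YARN_APPLICATION_CLASSPATH))
>      .getOrElse(YarnConfiguration.DEFAULT_YARN_APPLICATION_CLASSPATH)
>      .toSeq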
>
>   Thanks,
> Rahul Singhal
>
>   From: Sai Prasanna <ansaiprasa...@gmail.com>
> Reply-To: "user@spark.apache.org" <user@spark.apache.org>
> Date: Monday 7 April 2014 6:56 PM
> To: "user@spark.apache.org" <user@spark.apache.org>
> Subject: Null Pointer Exception in Spark Application with Yarn Client Mode
>
>   Hi All,
>
>  I wanted to get Spark on YARN up and running.
>
>  I did "SPARK_HADOOP_VERSION=2.3.0 SPARK_YARN=true ./sbt/sbt assembly"
>
>  Then I ran
> "SPARK_JAR=./assembly/target/scala-2.9.3/spark-assembly-0.8.1-incubating-hadoop2.3.0.jar
> SPARK_YARN_APP_JAR=examples/target/scala-2.9.3/spark-examples_2.9.3-0.8.1-incubating.jar
> MASTER=yarn-client ./spark-shell"
>
>  I have SPARK_HOME and YARN_CONF_DIR/HADOOP_CONF_DIR set, but I still
> get the following error.
>
>  Any clues?
>
>  ERROR:
> ...........................................
> Using Scala version 2.9.3 (Java HotSpot(TM) 64-Bit Server VM, Java 1.7.0_51)
> Initializing interpreter...
> Creating SparkContext...
> java.lang.NullPointerException
>         at scala.collection.mutable.ArrayOps$ofRef.length(ArrayOps.scala:115)
>         at scala.collection.IndexedSeqOptimized$class.foreach(IndexedSeqOptimized.scala:33)
>         at scala.collection.mutable.ArrayOps.foreach(ArrayOps.scala:38)
>         at org.apache.spark.deploy.yarn.Client$.populateHadoopClasspath(Client.scala:489)
>         at org.apache.spark.deploy.yarn.Client$.populateClasspath(Client.scala:510)
>         at org.apache.spark.deploy.yarn.Client.setupLaunchEnv(Client.scala:327)
>         at org.apache.spark.deploy.yarn.Client.runApp(Client.scala:90)
>         at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.start(YarnClientSchedulerBackend.scala:71)
>         at org.apache.spark.scheduler.cluster.ClusterScheduler.start(ClusterScheduler.scala:119)
>         at org.apache.spark.SparkContext.<init>(SparkContext.scala:273)
>         at org.apache.spark.repl.SparkILoop.createSparkContext(SparkILoop.scala:862)
>         at <init>(<console>:10)
>         at <init>(<console>:22)
>         at <init>(<console>:24)
>         at .<init>(<console>:28)
>         at .<clinit>(<console>)
>         at .<init>(<console>:7)
>         at .<clinit>(<console>)
>         at $export(<console>)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>         at java.lang.reflect.Method.invoke(Method.java:606)
>         at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:629)
>         at org.apache.spark.repl.SparkIMain$Request$$anonfun$10.apply(SparkIMain.scala:897)
>         at scala.tools.nsc.interpreter.Line$$anonfun$1.apply$mcV$sp(Line.scala:43)
>         at scala.tools.nsc.io.package$$anon$2.run(package.scala:25)
>         at java.lang.Thread.run(Thread.java:744)
> .....................................................................
>
