Look at the trace again. It is a very strange error: SparkSubmit is running
on the client side, but YarnClusterSchedulerBackend is supposed to be running
in the YARN AM.
I suspect you are submitting the job in yarn-client mode, but in
JavaSparkContext you set "yarn-cluster". As a result, the SparkContext is
created with the wrong deploy mode.
Hi Mate,
When you initialize the JavaSparkContext, you don’t need to specify the mode
“yarn-cluster”. I suspect that is the root cause.
Thanks.
Zhan Zhang
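A minimal sketch of the fix being suggested above, assuming Spark 1.x-era APIs: create the JavaSparkContext without hardcoding the master, and let spark-submit supply the mode instead. The class name and app name here are hypothetical placeholders, not from the original thread.

```java
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;

public class MyApp {  // hypothetical class name
    public static void main(String[] args) {
        // Do NOT call conf.setMaster("yarn-cluster") here. Hardcoding the
        // master in code conflicts with the mode spark-submit was given,
        // which can leave the cluster-side scheduler backend uninitialized.
        SparkConf conf = new SparkConf().setAppName("my-app");
        JavaSparkContext spark = new JavaSparkContext(conf);

        // ... job logic ...

        spark.stop();
    }
}
```

The master is then chosen at submit time, e.g. `spark-submit --master yarn-client --class MyApp myapp.jar` (or `yarn-cluster`), so the same jar works in both modes.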
On Feb 25, 2015, at 10:12 AM, gulyasm <mgulya...@gmail.com> wrote:
JavaSparkContext.
fs://output");
spark.stop();
}
Thank you for your assistance!
Mate Gulyas
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/NullPointerException-in-ApplicationMaster-tp21804.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.