Your Master is dead, and your application can't connect to it. Can you
verify whether it was your application that killed the Master (by checking
the Master logs before and after you submit your application)? Try
restarting your Master (and Workers) with `sbin/stop-all.sh` followed by
`sbin/start-all.sh`.
Hi, I'm developing an application with Spark.
My Java application tries to create the SparkContext like this:
public SparkContext createSparkContext() {
    String execUri = System.getenv("SPARK_EXECUTOR_URI");
    String[] jars = SparkILoop.getAddedJars();
    SparkConf conf = new SparkConf()
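
The message is cut off above, so for reference here is a minimal, self-contained sketch of how such a factory method is typically completed against a standalone cluster. The master URL, app name, and enclosing class name below are placeholders, not the original poster's values; whatever you use as the master URL has to match exactly the spark://host:port string shown in the Master web UI.

    import org.apache.spark.SparkConf;
    import org.apache.spark.SparkContext;
    import org.apache.spark.repl.SparkILoop;

    public class ContextFactory {
        // Builds a SparkContext against a standalone Master.
        // "spark://master-host:7077" and "MyApp" are placeholders.
        public SparkContext createSparkContext() {
            String execUri = System.getenv("SPARK_EXECUTOR_URI");
            String[] jars = SparkILoop.getAddedJars();
            SparkConf conf = new SparkConf()
                    .setMaster("spark://master-host:7077") // must match the Master's advertised URL
                    .setAppName("MyApp")
                    .setJars(jars);
            if (execUri != null) {
                // Only relevant when executors are fetched from a pre-packaged Spark distribution
                conf.set("spark.executor.uri", execUri);
            }
            return new SparkContext(conf);
        }
    }

Note that if the Master process itself is down, no driver-side configuration will make the connection succeed; the Master has to be brought back up first, as suggested in the reply above.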