Hi all,

I ran start-master.sh to start a standalone Spark master at spark://192.168.1.164:7077. Then I used the command below, and it worked fine:

    ./bin/spark-shell --master spark://192.168.1.164:7077
The console printed the expected messages, and the Spark context was initialised correctly. However, when I run the app from IntelliJ IDEA with a Spark conf like this:

    val sparkConf = new SparkConf()
      .setAppName("FromMySql")
      .setMaster("spark://192.168.1.164:7077")
      .set("spark.akka.heartbeat.interval", "100")
    val sc = new SparkContext(sparkConf)
    val sqlContext = new SQLContext(sc)

it can't talk to Spark and prints this error message:

    ReliableDeliverySupervisor: Association with remote system
    [akka.tcp://sparkMaster@192.168.1.164:7077] has failed, address is now
    gated for [5000] ms. Reason is: [Disassociated].

If I change the conf to local[*], it works. Also, after I package the app and launch it with spark-submit, the communication between the local and remote actors is fine. It's very strange!

I then debugged it, and the remote actor is fetched correctly in the tryRegisterAllMasters() method of AppClient:

    def tryRegisterAllMasters() {
      for (masterAkkaUrl <- masterAkkaUrls) {
        logInfo("Connecting to master " + masterAkkaUrl + "...")
        val actor = context.actorSelection(masterAkkaUrl)
        actor ! RegisterApplication(appDescription)
      }
    }

After the actor sends the RegisterApplication message, it seems the message is never routed to the remote actor, so the registration never completes and the application fails. I don't know what the reason is. Does anyone know the answer?

Regards,
Yi

--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Why-association-with-remote-system-has-failed-when-set-master-in-Spark-programmatically-tp22911.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
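P.S. One thing I still plan to try: when the driver runs inside the IDE, the application classes may never be shipped to the cluster, whereas spark-shell and spark-submit take care of that. A minimal sketch of the same conf that also ships the packaged jar explicitly via SparkConf.setJars (the jar path below is an assumption for illustration, not from my actual project):

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Same master as above, but explicitly listing the application jar so the
// executors can load our classes. The path is hypothetical; it would be
// whatever `sbt package` (or the IDE build) actually produces.
val sparkConf = new SparkConf()
  .setAppName("FromMySql")
  .setMaster("spark://192.168.1.164:7077")
  .setJars(Seq("target/scala-2.10/frommysql_2.10-1.0.jar"))
val sc = new SparkContext(sparkConf)
```

I don't know yet whether this alone explains the Disassociated error, so any pointers are still welcome.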