Hi Rui,

If you are getting a "Connection refused" exception, you can resolve it by
checking:

=> Master is running on the specified host


   - netstat -at | grep 7077

You will get something similar to:


   - tcp        0      0 akhldz.master.io:7077    *:*    LISTEN


If that is the case, then from your worker machine run:


   - host akhldz.master.io  (replace akhldz.master.io with your master
   host. If the lookup fails, then add a host entry in your /etc/hosts file)
   - telnet akhldz.master.io 7077  (If this is not connecting, then your
   worker won't connect either.)

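The two checks above can be combined into a small script. This is a minimal sketch, assuming a Linux worker with bash and getent available; akhldz.master.io and 7077 are the example host/port from above, and it uses bash's /dev/tcp probe in place of telnet:

```shell
check_master() {
    # $1: master hostname, $2: master port (e.g. akhldz.master.io 7077)
    local host=$1 port=$2
    # Same as the "host" check: does the name resolve at all?
    if ! getent hosts "$host" > /dev/null; then
        echo "DNS lookup for $host failed"
        return 1
    fi
    # Same as the "telnet" check: will a TCP connection open?
    if ! timeout 5 bash -c "exec 3<>/dev/tcp/$host/$port" 2>/dev/null; then
        echo "cannot connect to $host:$port"
        return 1
    fi
    echo "ok: $host:$port is reachable"
}
```

Run it as: check_master akhldz.master.io 7077 — if the DNS step fails, add the /etc/hosts entry described below; if only the connect step fails, look at firewalls and whether the master process is actually up.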

=> Adding a host entry in /etc/hosts


Open /etc/hosts on your worker machine and add the following entry
(example):

192.168.100.20   akhldz.master.io


PS: In the case above, Pillis had two IP addresses with the same host
name,

e.g.:
192.168.100.40  s1.machine.org
192.168.100.41  s1.machine.org
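You can scan a hosts file for exactly this situation. A minimal sketch (the function name and the default /etc/hosts path are illustrative; point it at any hosts-format file):

```shell
find_duplicate_hosts() {
    # $1: path to a hosts-format file (defaults to /etc/hosts).
    # Prints each hostname that is mapped to more than one distinct IP.
    awk '!/^[[:space:]]*#/ && NF >= 2 {
             # Field 1 is the IP; fields 2..NF are names for that IP.
             # Count each (name, IP) pair once, then distinct IPs per name.
             for (i = 2; i <= NF; i++)
                 if (!(($i, $1) in seen)) { seen[$i, $1] = 1; ips[$i]++ }
         }
         END {
             for (h in ips)
                 if (ips[h] > 1) print h " maps to " ips[h] " different IPs"
         }' "${1:-/etc/hosts}"
}
```

On the example above it would report that s1.machine.org maps to 2 different IPs; an empty result means no hostname is double-mapped in that file.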


Hope that helps. Please do post your stack trace if that doesn't solve your
problem.



On Tue, Feb 25, 2014 at 7:33 PM, Li, Rui <rui...@intel.com> wrote:

>  Hi Pillis,
>
>
>
> I met with the same problem here. Could you share how you solved the issue
> more specifically?
>
> I added an entry in /etc/hosts, but it doesn't help.
>
>
>
> *From:* Pillis W [mailto:pillis.w...@gmail.com]
> *Sent:* Sunday, February 09, 2014 4:49 AM
> *To:* u...@spark.incubator.apache.org
> *Subject:* Re: Akka Connection refused - standalone cluster using
> spark-0.9.0
>
>
>
> I fixed my issue - two IP addresses had the same hostname.
>
> Regards
>
>
>
>
>
>
>
> On Fri, Feb 7, 2014 at 12:59 PM, Soumya Simanta <soumya.sima...@gmail.com>
> wrote:
>
> I see similar logs but only when I try to run a standalone Scala program.
> The whole setup works just fine if I'm using the spark-shell/REPL.
>
>
>
>
>
>
>
> On Fri, Feb 7, 2014 at 3:05 PM, mohankreddy <mre...@beanatomics.com>
> wrote:
>
> Here's more information. I have the master up but when I try to get the
> workers up I am getting the following error.
>
>
> log4j:WARN No appenders could be found for logger
> (akka.event.slf4j.Slf4jLogger).
> log4j:WARN Please initialize the log4j system properly.
> log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for
> more info.
>
> 14/02/07 15:01:17 INFO Worker: Using Spark's default log4j profile:
> org/apache/spark/log4j-defaults.properties
> 14/02/07 15:01:17 INFO Worker: Starting Spark worker yyyyyyy:58020 with 16
> cores, 67.0 GB RAM
> 14/02/07 15:01:17 INFO Worker: Spark home: /opt/spark
> 14/02/07 15:01:17 INFO WorkerWebUI: Started Worker web UI at
> http://yyyyyyyyy:8081
> 14/02/07 15:01:17 INFO Worker: Connecting to master spark://xxxxx/:7077...
> 14/02/07 15:01:17 INFO RemoteActorRefProvider$RemoteDeadLetterActorRef:
> Message [org.apache.spark.deploy.DeployMessages$RegisterWorker] from
> Actor[akka://sparkWorker/user/Worker#2037095035] to
>
> Actor[akka://sparkWorker/deadLetters] was not delivered. [1] dead letters
> encountered. This logging can be turned off or adjusted with configuration
> settings 'akka.log-dead-letters' and
> 'akka.log-dead-letters-during-shutdown'.
>
> 14/02/07 15:01:37 INFO Worker: Connecting to master spark://xxxxx/:7077...
> 14/02/07 15:01:37 INFO RemoteActorRefProvider$RemoteDeadLetterActorRef:
> Message [org.apache.spark.deploy.DeployMessages$RegisterWorker] from
> Actor[akka://sparkWorker/user/Worker#2037095035] to
> Actor[akka://sparkWorker/deadLetters] was not delivered. [2] dead letters
>
> encountered. This logging can be turned off or adjusted with configuration
> settings 'akka.log-dead-letters' and
> 'akka.log-dead-letters-during-shutdown'.
>
> 14/02/07 15:01:57 INFO Worker: Connecting to master spark://xxxx/:7077...
> 14/02/07 15:01:57 INFO RemoteActorRefProvider$RemoteDeadLetterActorRef:
> Message [org.apache.spark.deploy.DeployMessages$RegisterWorker] from
> Actor[akka://sparkWorker/user/Worker#2037095035] to
> Actor[akka://sparkWorker/deadLetters] was not delivered. [3] dead letters
>
> encountered. This logging can be turned off or adjusted with configuration
> settings 'akka.log-dead-letters' and
> 'akka.log-dead-letters-during-shutdown'.
>
> 14/02/07 15:02:17 ERROR Worker: All masters are unresponsive! Giving up.
>
>
>
> PS: I masked the IPs
>
>
>
> --
> View this message in context:
> http://apache-spark-user-list.1001560.n3.nabble.com/Akka-Connection-refused-standalone-cluster-using-spark-0-9-0-tp1297p1311.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>
>
>
>
>



-- 
Thanks
Best Regards
