Hi all,

My Spark cluster (standalone mode) was running fine until yesterday, but now I am getting
the following exception when I run start-slaves.sh or start-all.sh:

slave3: failed to launch org.apache.spark.deploy.worker.Worker:
slave3:   at 
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:918)
slave3:   at java.lang.Thread.run(Thread.java:662)

The log file has the following lines:

14/06/27 11:06:30 INFO SecurityManager: Using Spark's default log4j profile: 
org/apache/spark/log4j-defaults.properties
14/06/27 11:06:30 INFO SecurityManager: Changing view acls to: hduser
14/06/27 11:06:30 INFO SecurityManager: SecurityManager: authentication 
disabled; ui acls disabled; users with view permissions: Set(hduser)
14/06/27 11:06:30 INFO Slf4jLogger: Slf4jLogger started
14/06/27 11:06:30 INFO Remoting: Starting remoting
Exception in thread "main" org.jboss.netty.channel.ChannelException: Failed to 
bind to: master/192.168.125.174:0
at org.jboss.netty.bootstrap.ServerBootstrap.bind(ServerBootstrap.java:272)
...
Caused by: java.net.BindException: Cannot assign requested address
...
I have seen the same error reported before and tried the suggested solutions:
setting the variable SPARK_LOCAL_IP, and changing SPARK_MASTER_PORT to a different
number. But nothing is working.
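For reference, this is roughly what I added to conf/spark-env.sh on each machine (the address and port values below are placeholders, not my actual ones):

```shell
# conf/spark-env.sh -- placeholder values, not my actual addresses
# Bind Spark to an explicit local address instead of the resolved hostname
export SPARK_LOCAL_IP=192.168.125.XXX   # this machine's own interface address
# Try a non-default master port in case 7077 was the problem
export SPARK_MASTER_PORT=7078
```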

However, when I start the worker directly on the respective machine with the following
java command, it runs without any exception:

java -cp 
::/usr/local/spark-1.0.0/conf:/usr/local/spark-1.0.0/assembly/target/scala-2.10/spark-assembly-1.0.0-hadoop1.2.1.jar
 -XX:MaxPermSize=128m -Dspark.akka.logLifecycleEvents=true -Xms512m -Xmx512m 
org.apache.spark.deploy.worker.Worker spark://master:7077
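In case it helps, here is a minimal sketch of a check for what the hostname resolves to, since "java.net.BindException: Cannot assign requested address" usually means the process tried to bind an address that no local network interface owns (for example, a stale /etc/hosts entry):

```python
# Minimal sketch: see what this machine's hostname resolves to.
# If the resolved address does not belong to any local interface,
# binding to it fails with "Cannot assign requested address".
import socket

hostname = socket.gethostname()
try:
    resolved = socket.gethostbyname(hostname)
except socket.gaierror:
    resolved = None  # hostname not resolvable at all -- also a red flag
print(f"{hostname} resolves to {resolved}")
```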


Could somebody please suggest a solution?
 
Thanks & Regards, 
Meethu M
