The contents of spark-env.sh are:
SPARK_MASTER_IP=marvin.spark.ins-01
SPARK_MASTER_PORT=7077
SPARK_MASTER_WEBUI_PORT=8080
SPARK_WORKER_WEBUI_PORT=8081
SPARK_WORKER_INSTANCES=1
SPARK_LOCAL_IP=marvin.spark.ins-01
The contents of /etc/hosts are:
172.28.161.33 marvin.base.ins-01
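One thing worth checking from the two snippets above: spark-env.sh binds to marvin.spark.ins-01, while the /etc/hosts entry maps 172.28.161.33 to marvin.base.ins-01. A quick sanity check along these lines (a sketch, not from the thread; the sample file and the check_host helper are illustrative -- on a real node you would grep /etc/hosts itself):

```shell
# Sketch: check that the hostname Spark is configured to bind to
# actually appears in the hosts file. The sample file below mirrors
# the /etc/hosts snippet from this thread.
HOSTS_FILE=$(mktemp)
cat > "$HOSTS_FILE" <<'EOF'
172.28.161.33 marvin.base.ins-01
EOF

check_host() {
  # Succeeds (exit 0) if the given name appears as a whole word.
  grep -qwF "$1" "$HOSTS_FILE"
}

check_host marvin.base.ins-01 && echo "marvin.base.ins-01: present in hosts file"
check_host marvin.spark.ins-01 || echo "marvin.spark.ins-01: MISSING (a likely cause of the bind error)"
```

If the name from spark-env.sh is the one that comes up missing, that mismatch would explain a "Cannot bind to the given ip address" failure.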
Can you paste the contents of your spark-env.sh file? It would also be good to
have a look at the /etc/hosts file. The "Cannot bind to the given ip address"
error can often be resolved by putting the hostname instead of the IP address.
Also make sure the configuration (conf directory) is the same across your
cluster.
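A minimal spark-env.sh along those lines might look like this (a sketch; the hostname is taken from the /etc/hosts snippet in this thread and assumes that name resolves on every node):

```shell
# spark-env.sh -- sketch using a resolvable hostname instead of a raw IP
SPARK_MASTER_IP=marvin.base.ins-01    # hostname, not an IP literal
SPARK_MASTER_PORT=7077
SPARK_MASTER_WEBUI_PORT=8080
SPARK_LOCAL_IP=marvin.base.ins-01     # bind the local Spark process to the same name
```

Whatever names end up here, the same conf/ directory should be deployed unchanged on every node of the cluster.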
No, the ./sbin/start-master.sh --ip option did not work... It still gives the
same error.
--
View this message in context:
http://apache-spark-developers-list.1001551.n3.nabble.com/Unable-to-run-applications-on-spark-in-standalone-cluster-mode-tp14683p14779.html
Sent from the Apache Spark Developers List mailing list archive at Nabble.com.
Hi Rohith,
Do you have multiple interfaces on the machine hosting the master?
If so, can you try to force to the public interface using:
sbin/start-master.sh --ip xxx.xxx.xxx.xxx
Regards
JB
On 10/19/2015 02:05 PM, Rohith Parameshwara wrote:
Hi all,
I am doing some experiments [...]