I start the Spark master with $SPARK_HOME/sbin/start-master.sh, but I use the
following to start the workers:
$SPARK_HOME/bin/spark-class org.apache.spark.deploy.worker.Worker spark://$MASTER:7077
See my blog for more details, although I need to update the posts based on what
I've changed today.
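For reference, a fuller sketch of that sequence (the --cores and --memory flags are optional, and the values below are only placeholders for your own hardware):

    # on the master node
    $SPARK_HOME/sbin/start-master.sh

    # on each worker node
    $SPARK_HOME/bin/spark-class org.apache.spark.deploy.worker.Worker \
        --cores 4 --memory 8g spark://$MASTER:7077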
How do you start the Spark daemon directly?
https://issues.apache.org/jira/browse/SPARK-11570
If that's the case, the solution is to start it by script, but I didn't read the
whole thing. In my little world (currently a 2-machine cluster, soon moving to
300) I have the same issue with 1.4.1, and I thought it
In the script or environment that launches your Spark driver, try setting the
SPARK_PUBLIC_DNS environment variable to point to a publicly-accessible
hostname.
See
https://spark.apache.org/docs/latest/configuration.html#environment-variables
for more details. This environment variable also affects
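As a minimal sketch (driver.example.com is just a placeholder for whatever hostname the cluster can actually resolve):

    # in the shell or wrapper script that launches the driver
    export SPARK_PUBLIC_DNS=driver.example.com
    $SPARK_HOME/bin/spark-submit --master spark://$MASTER:7077 my_app.py

The same variable can also be set in conf/spark-env.sh if you want the standalone daemons to advertise that hostname as well.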