Hi Jestin,
I've seen that most setups usually run the master and a slave on the same node.
I think this is because the master doesn't do as much work as the slaves do, and
since resources are expensive we want to make full use of them.
BTW, in my setup I also run the master and a slave together.
I have 5 nodes, and 3 of them run a master and a slave alongside each other.
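Roughly, co-locating just means starting both daemons on the same machine with the
standard sbin scripts, something like the sketch below (the host name node1 is only a
placeholder, and on newer Spark releases start-slave.sh is named start-worker.sh):

# start the standalone master on this node
$SPARK_HOME/sbin/start-master.sh
# start a worker on the same node, pointing it at the local master
$SPARK_HOME/sbin/start-slave.sh spark://node1:7077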
Hi Justine.
As I understand it, you are using Spark in standalone mode, meaning that you
start the master and slave/worker processes yourself.
You can specify the number of workers for each node in the
$SPARK_HOME/conf/spark-env.sh file, as below:
# Options for the daemons used in the standalone deploy mode
export SPARK_WORKER_INSTANCES=2   # number of worker processes to run on this node (example value)
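After editing spark-env.sh on each node, restart the daemons so the new settings take
effect. As a sketch, using the standard Spark sbin scripts from the master node:

# stop the master and all workers listed in conf/slaves
$SPARK_HOME/sbin/stop-all.sh
# start them again, picking up the new spark-env.sh settings
$SPARK_HOME/sbin/start-all.sh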