Re: Spark Standalone Cluster: Having a master and worker on the same node

2016-07-28 Thread Chanh Le
Hi Jestin, in most setups I have seen, the master and a worker run together on the same node. The master does not do nearly as much work as the workers do, and since resources are expensive we might as well use them. In my own setup I also co-locate them: I have 5 nodes, and 3 of them run a master and a worker alongside each other.
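
For reference, a minimal sketch of starting both daemons on one node with the standard sbin scripts of that era (the host name and port in the master URL are illustrative):

    # on the shared node: start the master, then a worker that registers with it
    $SPARK_HOME/sbin/start-master.sh
    $SPARK_HOME/sbin/start-slave.sh spark://node1:7077

The worker only needs the master's spark:// URL; it registers the same way whether or not it lives on the master's host.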

Re: Spark Standalone Cluster: Having a master and worker on the same node

2016-07-27 Thread Mich Talebzadeh
Hi Jestin. As I understand it, you are using Spark in standalone mode, meaning that you start the master and slave/worker processes yourself. You can specify the number of worker instances for each node in the $SPARK_HOME/conf/spark-env.sh file, under the options for the daemons used in the standalone deploy mode.
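
A minimal sketch of the relevant spark-env.sh lines, assuming the standard SPARK_WORKER_* variables from the bundled spark-env.sh.template (the values shown are illustrative, not recommendations):

    # Options for the daemons used in the standalone deploy mode
    export SPARK_WORKER_INSTANCES=2   # worker processes to launch per node
    export SPARK_WORKER_CORES=4       # cores each worker may use
    export SPARK_WORKER_MEMORY=8g     # memory each worker may hand to executors

After editing the file, restart the daemons (sbin/stop-all.sh, then sbin/start-all.sh) for the new settings to take effect.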