Hi,
I have a Spark standalone cluster on which I want to run 3 workers per node,
so I set SPARK_WORKER_INSTANCES to 3 in spark-env.sh.
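Roughly, the relevant part of my spark-env.sh looks like this (the cores and
memory values below are only illustrative, not my exact settings):

    # spark-env.sh on each worker node
    export SPARK_WORKER_INSTANCES=3   # start 3 worker processes per node
    export SPARK_WORKER_CORES=4       # cores per worker (illustrative)
    export SPARK_WORKER_MEMORY=8g     # memory per worker (illustrative)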
The problem is that when I run spark-shell I get the following warning:
WARN SparkConf:
SPARK_WORKER_INSTANCES was detected (set to '3').
This is deprecated in Spark 1.0+.

Please instead use:
- ./spark-submit with --num-executors to specify the number of executors
- Or set SPARK_EXECUTOR_INSTANCES
- spark.executor.instances to configure the number of instances in the spark config.

So how would I start a cluster with 3 workers per node? SPARK_WORKER_INSTANCES
is the only way I see to start multiple workers in the standalone cluster, and
the only place I see to define it is spark-env.sh. The spark-submit option,
SPARK_EXECUTOR_INSTANCES, and spark.executor.instances all relate to submitting
a job, not to starting the cluster itself.
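As far as I understand, the alternatives the warning suggests (--num-executors,
SPARK_EXECUTOR_INSTANCES, spark.executor.instances) all configure executors at
job-submission time, something like this (the host, class and jar names are
just placeholders):

    ./bin/spark-submit \
      --master spark://master-host:7077 \
      --conf spark.executor.instances=3 \
      --class MyApp myapp.jar

But that tells Spark how many executors one particular job should get; it does
not start 3 worker processes per node, which is what I am trying to do.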

Any ideas?
Thanks
                Assaf
