Sent: Thursday, October 06, 2016 9:07 PM
To: Mendelson, Assaf
Cc: user@spark.apache.org
Subject: Re: spark standalone with multiple workers gives a warning
The slaves should connect to the master using the scripts in sbin...
You can read about it here:
http://spark.apache.org/docs/latest/spark-standalone.html
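For reference, the sbin workflow the reply points at can be sketched roughly like this (the master hostname and the SPARK_HOME path are placeholders, not taken from the thread):

```shell
# Rough sketch of launching a standalone cluster with the sbin scripts.

# On the master node: start the master daemon
# (it prints a spark://host:port URL for workers to connect to).
$SPARK_HOME/sbin/start-master.sh

# On each worker node: start a worker and point it at the master URL.
$SPARK_HOME/sbin/start-slave.sh spark://master-host:7077

# Alternatively, from the master, start master plus all workers listed
# in conf/slaves over ssh in one step:
$SPARK_HOME/sbin/start-all.sh
```

With this setup the workers register with the master themselves, so spark-shell only needs `--master spark://master-host:7077`.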
On Thu, Oct 6, 2016 at 6:46 PM, Mendelson, Assaf wrote:
> Hi,
>
> I have a spark standalone cluster. On it, I am using 3 workers per node,
> so I added SPARK_WORKER_INSTANCES set to 3 in spark-env.sh.
> The problem is that when I run spark-shell I get the following warning:
>
> WARN SparkConf:
> SPARK_WORKER_INSTANCES was detected (set to '3').
> This is deprecated in Sp
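For what it's worth, the standalone docs note that if you raise SPARK_WORKER_INSTANCES above 1, you should also set SPARK_WORKER_CORES and SPARK_WORKER_MEMORY explicitly so the workers don't each try to claim the whole machine. A sketch of such a conf/spark-env.sh (the 4-core / 8g figures are illustrative, not from the thread):

```shell
# conf/spark-env.sh (sketch) -- three worker daemons per node
export SPARK_WORKER_INSTANCES=3

# With multiple workers per node, cap each worker's share explicitly,
# otherwise every worker assumes it owns all cores/memory on the host.
export SPARK_WORKER_CORES=4
export SPARK_WORKER_MEMORY=8g
```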