> *From:* Ted Yu [mailto:yuzhih...@gmail.com]
> *Sent:* Monday, February 01, 2016 5:45 PM
> *To:* Lin, Hao
> *Cc:* user
> *Subject:* Re: SPARK_WORKER_INSTANCES deprecated
As the message (from SparkConf.scala) showed, you shouldn't use
SPARK_WORKER_INSTANCES any more.
FYI
On Mon, Feb 1, 2016 at 2:19 PM, Lin, Hao <hao@finra.org> wrote:
Can I still use SPARK_WORKER_INSTANCES in conf/spark-env.sh? The following is
what I’ve got after trying to set this parameter and run spark-shell:
SPARK_WORKER_INSTANCES was detected (set to '32').
This is deprecated in Spark 1.0+.
Please instead use:
- ./spark-submit with --num-executors to sp