As the message (from SparkConf.scala) indicates, you shouldn't use
SPARK_WORKER_INSTANCES anymore; switch to one of the alternatives it lists.
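
For example, here is a minimal sketch (not from the original exchange; the
value 32 just mirrors the number in the warning below, and the app name is a
placeholder) of setting spark.executor.instances programmatically:

    // Sketch of the programmatic replacement for SPARK_WORKER_INSTANCES.
    // "my-app" and the value 32 are placeholders, not from the thread.
    import org.apache.spark.{SparkConf, SparkContext}

    val conf = new SparkConf()
      .setAppName("my-app")
      .set("spark.executor.instances", "32")  // replaces SPARK_WORKER_INSTANCES=32

    val sc = new SparkContext(conf)

Equivalently, as the warning itself suggests, you can pass --num-executors 32
to spark-submit (or spark-shell), or set SPARK_EXECUTOR_INSTANCES in
conf/spark-env.sh.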

FYI

On Mon, Feb 1, 2016 at 2:19 PM, Lin, Hao <hao....@finra.org> wrote:

> Can I still use SPARK_WORKER_INSTANCES in conf/spark-env.sh? The
> following is what I get after setting this parameter and running
> spark-shell:
>
> SPARK_WORKER_INSTANCES was detected (set to '32').
> This is deprecated in Spark 1.0+.
>
> Please instead use:
> - ./spark-submit with --num-executors to specify the number of executors
> - Or set SPARK_EXECUTOR_INSTANCES
> - spark.executor.instances to configure the number of instances in the
>   spark config.