Hi Eevee,

For the executor, have you tried

a. Passing --conf "spark.executor.extraJavaOptions=-XX:+ExitOnOutOfMemoryError"
as part of the spark-submit command line if you want it application
specific, OR
b. Setting spark.executor.extraJavaOptions in conf/spark-defaults.conf for
all jobs.
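
A rough sketch of both (untested; the class name, master URL, and jar
below are placeholders, substitute your own):

    # (a) per-application, on the spark-submit command line:
    spark-submit \
      --class com.example.MyApp \
      --master spark://master:7077 \
      --conf "spark.executor.extraJavaOptions=-XX:+ExitOnOutOfMemoryError" \
      my-app.jar

    # (b) for all jobs, one line in conf/spark-defaults.conf:
    spark.executor.extraJavaOptions  -XX:+ExitOnOutOfMemoryError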


Thanks,
Sonal
Nube Technologies <http://www.nubetech.co>

<http://in.linkedin.com/in/sonalgoyal>



On Thu, Aug 30, 2018 at 5:12 PM, Evelyn Bayes <u5015...@gmail.com> wrote:

> Hey all,
>
> Stuck trying to set a parameter in the spark-env.sh and I’m hoping someone
> here knows how.
>
> I want to set the JVM setting -XX:+ExitOnOutOfMemoryError for both Spark
> executors and Spark workers in standalone mode.
>
> So far my best guess is:
> *Worker*
> SPARK_WORKER_OPTS="${SPARK_WORKER_OPTS} -Dspark.worker.extraJavaOptions=-XX:+ExitOnOutOfMemoryError"
> *Executor*
> SPARK_DAEMON_JAVA_OPTS="${SPARK_DAEMON_JAVA_OPTS} -Dspark.executor.extraJavaOptions=-XX:+ExitOnOutOfMemoryError"
>
> Anyone know the actual way to set this or a good place to learn about how
> this stuff works? I’ve already seen the Spark conf and standalone
> documentation and it doesn’t really make this stuff clear.
>
> Thanks a bunch,
> Eevee.
>