Hi All,

Please help me set the executor JVM memory size. I am using the Spark shell,
and it appears that the executors are started with a predefined JVM heap of
512m as soon as the Spark shell starts. How can I change this setting? I
tried setting SPARK_EXECUTOR_MEMORY before launching the Spark shell:

export SPARK_EXECUTOR_MEMORY=1g

I also tried several other approaches:

1) setting SPARK_WORKER_MEMORY in conf/spark-env.sh on the worker
2) passing it as the -m argument when starting the worker by hand:
bin/start-slave.sh 1 -m 1g
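For reference, the attempts above can be sketched as shell commands (a sketch, assuming a standalone-mode cluster; exact flags and variable behavior may differ by Spark version):

```shell
# Attempt via environment variable, exported before launching the Spark shell
export SPARK_EXECUTOR_MEMORY=1g

# Attempt 1: per-worker memory setting in conf/spark-env.sh on the worker
# (note: this caps the total memory the worker may hand out to executors,
# which is not necessarily the same as each executor's JVM heap)
echo 'export SPARK_WORKER_MEMORY=1g' >> conf/spark-env.sh

# Attempt 2: -m flag when starting the worker manually
bin/start-slave.sh 1 -m 1g
```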

Thank you,
Oleg
