Yes, the problem is that the driver program is overriding it. Have you set it 
manually in the driver? How did you try setting it on the workers? You should 
set it by adding

export SPARK_JAVA_OPTS="-Dspark.local.dir=whatever"

to conf/spark-env.sh on those workers.
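For example, a minimal conf/spark-env.sh sketch (the path /data/spark-tmp is just a placeholder — substitute each worker's own local directory):

```shell
# conf/spark-env.sh on each worker node.
# /data/spark-tmp is a placeholder; point it at that worker's local disk.
export SPARK_JAVA_OPTS="-Dspark.local.dir=/data/spark-tmp"
```

Restart the worker after editing so the executor JVMs it launches pick up the option.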

Matei

On Mar 27, 2014, at 9:04 PM, Tsai Li Ming <mailingl...@ltsai.com> wrote:

> Anyone can help?
> 
> How can I configure a different spark.local.dir for each executor?
> 
> 
> On 23 Mar, 2014, at 12:11 am, Tsai Li Ming <mailingl...@ltsai.com> wrote:
> 
>> Hi,
>> 
>> Each of my worker nodes has its own unique spark.local.dir.
>> 
>> However, when I run spark-shell, the shuffle writes are always written to 
>> /tmp despite being set when the worker node is started.
>> 
>> Specifying spark.local.dir in the driver program seems to override the 
>> executors' setting. Is there a way to define it properly on the worker 
>> nodes?
>> 
>> Thanks!
> 
