Hi All,

I'm having some trouble setting the disk spill directory for Spark.  The
following approaches set "spark.local.dir" (according to the "Environment"
tab of the web UI) but produce the indicated warnings:

*In spark-env.sh:*
export SPARK_JAVA_OPTS="-Dspark.local.dir=/spark/spill"
*Associated warning:*
14/08/14 10:10:39 WARN SparkConf: In Spark 1.0 and later spark.local.dir
will be overridden by the value set by the cluster manager (via
SPARK_LOCAL_DIRS in mesos/standalone and LOCAL_DIRS in YARN).
14/08/14 10:10:39 WARN SparkConf:
SPARK_JAVA_OPTS was detected (set to '-Dspark.local.dir=/spark/spill').
This is deprecated in Spark 1.0+.
Please instead use...
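
If I'm reading that deprecation notice right, the per-application
replacement is supposed to go through spark-submit. Here's a sketch of
what I understand the suggested alternative to look like (the
--driver-java-options flag is from the Spark docs; the app class and jar
are placeholders, and my understanding is this only affects the driver
JVM, not the executors):

*On the command line:*
./bin/spark-submit \
  --driver-java-options "-Dspark.local.dir=/spark/spill" \
  --class com.example.MyApp myapp.jar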

*In spark-defaults.conf:*
spark.local.dir                  /spark/spill
*Associated warning:*
14/08/14 10:09:04 WARN SparkConf: In Spark 1.0 and later spark.local.dir
will be overridden by the value set by the cluster manager (via
SPARK_LOCAL_DIRS in mesos/standalone and LOCAL_DIRS in YARN).
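
(In case the cluster manager matters here: my understanding is that
under YARN the directories come from the NodeManager config rather than
from Spark at all, via something like the following; the property name
is from the Hadoop docs, so treat this as a sketch.)

*In yarn-site.xml:*
<!-- YARN exports these to containers as LOCAL_DIRS, which then
     overrides spark.local.dir -->
<property>
  <name>yarn.nodemanager.local-dirs</name>
  <value>/spark/spill</value>
</property>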

The following produces no warnings, but also no sign (in the web UI)
that "spark.local.dir" has actually been set:

*In spark-env.sh:*
export SPARK_LOCAL_DIRS=/spark/spill
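
Since the Environment tab seems to list only Spark and system
properties, I suspect an environment variable wouldn't show up there
even if it were working, so I tried checking on disk instead. Here's a
rough check (it assumes a shuffle-heavy job is running, in which case
Spark should create per-application scratch directories under its local
dirs):

*On a worker node:*
watch -n 1 'ls -l /spark/spill'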

Does anybody know whether SPARK_LOCAL_DIRS actually works as advertised, or
if I am perhaps using it incorrectly?

best,
-Brad
