Can you check if this JIRA is relevant?
https://issues.apache.org/jira/browse/SPARK-2608
If not, can you make a new one?
On Thu, Oct 27, 2016 at 10:27 PM, Rodrick Brown wrote:
Try setting the values in $SPARK_HOME/conf/spark-defaults.conf
i.e.
$ egrep 'spark.(driver|executor).extra' /data/orchard/spark-2.0.1/conf/spark-defaults.conf
spark.executor.extraJavaOptions -Duser.timezone=UTC -Xloggc:garbage-collector.log
spark.driver.extraJavaOptions -Dus
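A quick way to verify that those executor options actually took effect is to read the property back from inside a job. This is only a sketch, assuming the `sc` that spark-shell provides (or any live SparkContext):

// Read user.timezone on each executor; with
// spark.executor.extraJavaOptions=-Duser.timezone=UTC in effect,
// every value collected here should be "UTC".
val zones = sc.parallelize(1 to 100)
  .map(_ => System.getProperty("user.timezone"))
  .distinct()
  .collect()
println(zones.mkString(", "))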
We were using 1.6, but now we are on 2.0.1. Both versions show the same issue.
I dove deep into the Spark code and identified that the extra Java options are /not/ added to the process on the executors. At this point, I believe you have to use spark-defaults.conf to set any values that will be picked up by the executors.
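One sketch of how to double-check that finding, again assuming a live SparkContext `sc` such as the one spark-shell provides, is to dump the JVM arguments the executor processes were actually launched with:

import java.lang.management.ManagementFactory
import scala.collection.JavaConverters._

// Collect the raw JVM launch arguments from each executor process; if
// spark.executor.extraJavaOptions were honored, flags such as
// -Xloggc:garbage-collector.log would appear in this output.
val executorJvmArgs = sc.parallelize(1 to 100)
  .map(_ => ManagementFactory.getRuntimeMXBean.getInputArguments.asScala.mkString(" "))
  .distinct()
  .collect()
executorJvmArgs.foreach(println)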
I'm seeing something very similar in my own Mesos/Spark cluster.
High-level summary: when I use `--deploy-mode cluster`, Java properties that
I pass to my driver via `spark.driver.extraJavaOptions` are not available to
the driver. I've confirmed this by inspecting the output of
`System.getProperties()` on the driver.
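For reference, the check described above can be as simple as printing the driver's system properties at the top of the application's main method, before the SparkContext is created. This is just a sketch; the property to look for is whatever was passed via spark.driver.extraJavaOptions (e.g. -Duser.timezone=UTC from the config above):

import scala.collection.JavaConverters._

// Print every system property visible to the driver JVM; anything set through
// spark.driver.extraJavaOptions should appear here if the options were
// actually applied to the driver in cluster mode.
System.getProperties.asScala.toSeq.sortBy(_._1).foreach {
  case (key, value) => println(s"$key=$value")
}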