The problem in my case seems to be that bin/spark-class uses the
SPARK_SUBMIT_OPTS and SPARK_JAVA_OPTS environment variables, but nothing
parses spark-defaults.conf before the java process is started.

Here's an example of the process running when only spark-defaults.conf is
used:

514 5189 5182 4 21:05 pts/2 00:00:22 /usr/local/java/bin/java -cp
::/opt/spark/conf:/opt/spark/lib/spark-assembly-1.0.1-hadoop2.3.0-mr1-cdh5.0.2.jar:/etc/hadoop/conf-mx
-XX:MaxPermSize=128m -Djava.library.path= -Xms512m -Xmx512m
org.apache.spark.deploy.SparkSubmit spark-shell -v --class
org.apache.spark.repl.Main

Here's an example of it when the command line --driver-java-options is
used (and thus things work):

$ ps -ef | grep spark

514 5399 5392 80 21:15 pts/2 00:00:06 /usr/local/java/bin/java -cp
::/opt/spark/conf:/opt/spark/lib/spark-assembly-1.0.1-hadoop2.3.0-mr1-cdh5.0.2.jar:/etc/hadoop/conf-mx
-Dfoo.bar.baz=23 -XX:MaxPermSize=128m -Djava.library.path= -Xms512m -Xmx512m
org.apache.spark.deploy.SparkSubmit spark-shell -v
--driver-java-options -Dfoo.bar.baz=23 --class org.apache.spark.repl.Main
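
A possible workaround, sketched here rather than taken from the thread:
since bin/spark-class reads SPARK_SUBMIT_OPTS, and (assuming a 1.0.x
layout) sources conf/spark-env.sh before building the java command line,
per-deployment JVM options can be exported there until spark-defaults.conf
is honored:

# conf/spark-env.sh -- sketch, not from the original message
# spark-class sources this file before launching the JVM, so options
# appended to SPARK_SUBMIT_OPTS reach the driver process.
SPARK_SUBMIT_OPTS="$SPARK_SUBMIT_OPTS -Dfoo.bar.baz=23"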

On Wed, Jul 30, 2014 at 3:43 PM, Patrick Wendell wrote:

> Cody - in your example you are using the '=' character, but in our
> documentation and tests we use a whitespace to separate the key and
> value in the defaults file:
>
> spark.driver.extraJavaOptions -Dfoo.bar.baz=23
>
> I'm not sure if the java properties file parser will try to interpret
> the equals sign. If so you might need to do this.
>
> spark.driver.ext...
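
For what it's worth, a quick REPL check of java.util.Properties -- an
assumption about how spark-defaults.conf is parsed, sketched here for
illustration -- suggests the '=' form should still yield the full value,
since only the first separator splits the key from the value:

scala> val p = new java.util.Properties()
scala> p.load(new java.io.StringReader("spark.driver.extraJavaOptions=-Dfoo.bar.baz=23"))
scala> p.getProperty("spark.driver.extraJavaOptions")
res0: String = -Dfoo.bar.baz=23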

... one already?

For system properties SparkSubmit should be able to read those settings
and do the right thing, but that obviously won't work for other JVM
options... the current code should work fine in cluster mode.
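
A minimal sketch of that in-process handling, for -D options only. This
is hypothetical code, not SparkSubmit's actual implementation: system
properties can still be applied after the JVM has started (the
salvageable case described above), while flags like -Xmx cannot be.

// Hypothetical helper: apply only the -Dkey=value entries from
// spark.driver.extraJavaOptions to the already-running JVM.
def applySystemProperties(extraJavaOptions: String): Unit = {
  extraJavaOptions.split("\\s+").filter(_.startsWith("-D")).foreach { opt =>
    opt.stripPrefix("-D").split("=", 2) match {
      case Array(k, v) => System.setProperty(k, v)
      case Array(k)    => System.setProperty(k, "")
    }
  }
}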

The original report:

> We were previously using SPARK_JAVA_OPTS to set java system properties
> via -D.
>
> This was used for properties that varied on a per-deployment-environment
> basis, but needed to be available in the spark shell and workers.
>
> On upgrading to 1.0, we saw that SPARK_JAVA_OPTS had been deprecated, and
> replaced by spark-defaults.conf and command line arguments to spark-submit
> or spark-shell.
>
> However, setting spark.driver.extraJavaOptions and
> spark.executor.extraJavaOptions in spark-defaults.conf is not a replacement
> for SPARK_JAVA_OPTS:
>
> $ cat conf/spark-defaults.conf
> spark.driver.extraJavaOptions=-Dfoo.bar.baz=23
>
> $ ./bin/spark-shell
>
> scala> System.getProperty("foo.bar.baz")
> res0: String = null
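
By contrast, the command-line route shown earlier does propagate the
property; the transcript below is reconstructed for illustration, not
copied from the thread:

$ ./bin/spark-shell --driver-java-options "-Dfoo.bar.baz=23"

scala> System.getProperty("foo.bar.baz")
res0: String = 23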