Re: replacement for SPARK_JAVA_OPTS

2014-08-07 Thread Andrew Or
...problem in my case seems to be that bin/spark-class uses the SPARK_SUBMIT_OPTS and SPARK_JAVA_OPTS environment variables, but nothing parses ...
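Since bin/spark-class does read SPARK_SUBMIT_OPTS before launching the JVM, a minimal sketch of what exporting it would look like follows; using conf/spark-env.sh for this, and using it as a stop-gap at all, are assumptions of mine rather than anything the thread recommends.

# conf/spark-env.sh (hypothetical stop-gap, not prescribed by the thread)
# spark-env.sh is sourced before spark-class assembles the java command,
# so flags placed here reach the driver JVM at launch
export SPARK_SUBMIT_OPTS="-Dfoo.bar.baz=23"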

Re: replacement for SPARK_JAVA_OPTS

2014-08-07 Thread Patrick Wendell
...spark-defaults.conf before the java process is started. Here's an example of the process running when only spark-defaults.conf is ...

Re: replacement for SPARK_JAVA_OPTS

2014-08-07 Thread Andrew Or
... 514 5189 5182 4 21:05 pts/2 00:00:22 /usr/local/java/bin/java -cp ::/opt/spark/conf:/opt/spark/lib/spark-assemb...

Re: replacement for SPARK_JAVA_OPTS

2014-08-07 Thread Andrew Or
...:/etc/hadoop/conf-mx -XX:MaxPermSize=128m -Djava.library.path= -Xms512m -Xmx512m org.apache.spark.deploy.SparkSubmit spark-shell -v --class org.apache...

Re: replacement for SPARK_JAVA_OPTS

2014-08-07 Thread Gary Malouf
... Here's an example of it when the command line --driver-java-options is used (and thus things work): $ ps -ef | grep spark ...
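For contrast with the failing spark-defaults.conf case, a minimal sketch of the working invocation described above; the exact command and the returned value are assumptions based on the report that --driver-java-options works, not output from the archive.

$ ./bin/spark-shell --driver-java-options -Dfoo.bar.baz=23

scala> System.getProperty("foo.bar.baz")
res0: String = 23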

Re: replacement for SPARK_JAVA_OPTS

2014-08-07 Thread Marcelo Vanzin
...23 ... 514 5399 5392 80 21:15 pts/2 00:00:06 /usr/local/java/bin/java -cp ::/opt/spark/conf:/opt/spark/lib/spark-assembly-1.0.1-hadoop2.3.0-mr1-cdh5.0.2.jar:/etc/h...

Re: replacement for SPARK_JAVA_OPTS

2014-08-07 Thread Cody Koeninger
... --driver-java-options -Dfoo.bar.baz=23 --class org.apache.spark.repl.Main ... On Wed, Jul 30, 2014 at 3:43 PM, Patrick Wendell wrote: ...

Re: replacement for SPARK_JAVA_OPTS

2014-07-31 Thread Cody Koeninger
... wrote: Cody - in your example you are using the '=' character, but in our documentation and tests we use a whitespace to separate the key and value in the defaults file. ...

Re: replacement for SPARK_JAVA_OPTS

2014-07-30 Thread Patrick Wendell
... spark.driver.extraJavaOptions -Dfoo.bar.baz=23 ... I'm not sure if the java properties file parser will try to interpret the equals sign. If so you might need to do this. spark.driver.ext...
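Concretely, the whitespace-separated form suggested above would look like this in the defaults file; the comment is mine, and whether the '=' form is actually mis-parsed is left open here, as it is in the thread.

$ cat conf/spark-defaults.conf
# key and value separated by whitespace, as in the Spark documentation and tests
spark.driver.extraJavaOptions -Dfoo.bar.baz=23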

Re: replacement for SPARK_JAVA_OPTS

2014-07-30 Thread Patrick Wendell
...one already? For system properties SparkSubmit should be able to read those settings and do the right thing, but that obviously won't work for other JVM options... the current code should work fine in cluster mode ...
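To illustrate the distinction drawn above with hypothetical values of my own: a -D flag defines a system property, which SparkSubmit could in principle set after the JVM is up, while a flag such as -XX:MaxPermSize only takes effect on the java command line, which in client mode is built before spark-defaults.conf is ever read. The two lines below are alternatives, not one file.

# a system property; could still be applied after the driver JVM starts
spark.driver.extraJavaOptions -Dfoo.bar.baz=23

# a non-property JVM flag; has no effect unless it is on the launch command line
spark.driver.extraJavaOptions -XX:MaxPermSize=256m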

Re: replacement for SPARK_JAVA_OPTS

2014-07-30 Thread Cody Koeninger
...previously using SPARK_JAVA_OPTS to set java system properties via -D. This was used for properties that varied on a per-deployment-environment basis, but needed to be available in the spark shell and workers. ...
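A minimal sketch of the pre-1.0 pattern described above, assuming the variable was exported from conf/spark-env.sh; the thread only says SPARK_JAVA_OPTS carried -D flags that varied per deployment environment.

# conf/spark-env.sh (pre-1.0 pattern, deprecated in Spark 1.0)
export SPARK_JAVA_OPTS="-Dfoo.bar.baz=23"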

Re: replacement for SPARK_JAVA_OPTS

2014-07-30 Thread Cody Koeninger
...per-deployment-environment basis, but needed to be available in the spark shell and workers. On upgrading to 1.0, we saw that SPARK_JAVA_OPTS had been deprecated, and replaced by spark-defaults.conf and command line arguments to spar...

Re: replacement for SPARK_JAVA_OPTS

2014-07-30 Thread Patrick Wendell
...but needed to be available in the spark shell and workers. On upgrading to 1.0, we saw that SPARK_JAVA_OPTS had been deprecated, and replaced by spark-defaults.conf and command line arguments to spark-submit or spark-shell. However, setting s...

Re: replacement for SPARK_JAVA_OPTS

2014-07-30 Thread Marcelo Vanzin
...spark.executor.extraJavaOptions in spark-defaults.conf is not a replacement for SPARK_JAVA_OPTS:

$ cat conf/spark-defaults.conf
spark.driver.extraJavaOptions=-Dfoo.bar.baz=23

$ ./bin/spark-shell

scala> System.getProperty("foo.bar.baz")
res0: S...

replacement for SPARK_JAVA_OPTS

2014-07-30 Thread Cody Koeninger
...replaced by spark-defaults.conf and command line arguments to spark-submit or spark-shell. However, setting spark.driver.extraJavaOptions and spark.executor.extraJavaOptions in spark-defaults.conf is not a replacement for SPARK_JAVA_OPTS:

$ cat conf/spark-defaults.conf
spark.driver.extraJavaOptions...
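The archive truncates the reproduction, so here is a cleaned-up sketch of the session the thread describes; the null result is an assumption based on the report that the property never reached the driver, not output copied from the archive.

$ cat conf/spark-defaults.conf
spark.driver.extraJavaOptions=-Dfoo.bar.baz=23

$ ./bin/spark-shell

scala> System.getProperty("foo.bar.baz")
res0: String = null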