Hi Polina, I tried the first and third approaches with zeppelin-0.6.0-bin-netinst.tgz and both seem to work for me.
> 3. restart zeppelin, check interpreter tab.

Here is one suspicious part you might have missed: did you create a new Spark
interpreter after restarting Zeppelin? One thing to keep in mind is that
properties defined in interpreter/spark/interpreter-setting.json take
priority over spark-defaults.conf. To be more specific, you will need to
change the interpreter/spark/interpreter-setting.json file if you want to
change one of the properties below:

spark.app.name
spark.executor.memory
spark.cores.max
spark.master

Properties other than the above you can configure in spark-defaults.conf.

Regards,
Mina

On Tue, Aug 9, 2016 at 8:12 AM Polina Marasanova <
polina.marasan...@quantium.com.au> wrote:

> Hi,
>
> I think I have hit a bug in Zeppelin 0.6.0.
> What happened: I'm not able to override Spark interpreter properties from
> config, only via the GUI.
> What I've tried:
>
> First approach (worked on the previous version):
> 1. export SPARK_CONF_DIR=/zeppelin/conf/spark
> 2. add my "spark-defaults.conf" file to this directory
> 3. restart zeppelin, check interpreter tab.
> Doesn't work.
>
> Second approach:
> 1. add "spark-defaults.conf" to ZEPPELIN_CONF_DIR
> 2. pass this directory to the start script: bin/zeppelin.sh --config
> ZEPPELIN_CONF_DIR
> Doesn't work.
>
> Third approach:
> 1. create a file interpreter-setting.json with my configs, similar to this
> one:
> https://github.com/apache/zeppelin/blob/master/spark/src/main/resources/interpreter-setting.json
> 2. add it to the directory /zeppelin/interpreter/spark
> 3. restart zeppelin
> According to the logs, Zeppelin does appear to be fetching this file, but
> there are still no changes in the GUI.
>
> If I add all the properties via the GUI it works fine, but it means that
> users have to do it themselves, which is a bit of a pain.
>
> Has anyone seen something similar to my issue?
>
> Thank you.
>
> Regards,
> Polina
> ________________________________________
> From: ndj...@gmail.com [ndj...@gmail.com]
> Sent: Monday, 8 August 2016 3:41 PM
> To: users@zeppelin.apache.org
> Subject: Re: Config for spark interpreter
>
> Hi Polina,
>
> You can just define SPARK_HOME in conf/zeppelin-env.sh and get rid of any
> other Spark configuration in that file, otherwise Zeppelin will just
> overwrite it. Once this is done, you can define the Spark default
> configurations in Spark's own config file, conf/spark-defaults.conf.
>
> Cheers,
> Ndjido.
>
> > On 08 Aug 2016, at 07:31, Polina Marasanova <
> > polina.marasan...@quantium.com.au> wrote:
> >
> > Hi everyone,
> >
> > I have a question: in previous versions of Zeppelin all settings for
> > interpreters were stored in a file called "interpreter.json". It was
> > very convenient to provide some default Spark settings there, such as
> > spark.master, the default driver memory, etc.
> > What is the best way in version 0.6.0 to provide a bunch of defaults
> > specific to a project?
> >
> > Thanks
> > Cheers
> > Polina Marasanova
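[Editor's note: for readers following Mina's advice, a rough sketch of what editing one of the four priority properties in interpreter/spark/interpreter-setting.json might look like. This is an assumption based on the file linked in the thread; the exact schema (and the surrounding "group"/"name" fields) should be checked against that file for your Zeppelin version, and the master URL shown is only a placeholder:]

```json
{
  "properties": {
    "spark.master": {
      "envName": "MASTER",
      "propertyName": "spark.master",
      "defaultValue": "spark://my-master:7077",
      "description": "Spark master URI."
    }
  }
}
```

After changing this file you would restart Zeppelin and, as Mina notes, create a new Spark interpreter so the new defaults are picked up.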
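[Editor's note: combining the advice in the thread, properties other than spark.app.name, spark.executor.memory, spark.cores.max, and spark.master can go in spark-defaults.conf. A sketch of Polina's first approach under that constraint, using the paths from her mail (adjust to your install; the restart command and example property values are assumptions, not from the thread):]

```shell
# Point the Spark interpreter at a custom Spark conf directory
# (path taken from Polina's mail).
export SPARK_CONF_DIR=/zeppelin/conf/spark

# /zeppelin/conf/spark/spark-defaults.conf would then hold properties
# that are NOT overridden by interpreter-setting.json, e.g.:
#
#   spark.driver.memory    4g
#   spark.serializer       org.apache.spark.serializer.KryoSerializer

# Restart Zeppelin so the interpreter re-reads the configuration.
/zeppelin/bin/zeppelin-daemon.sh restart
```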