Hi moon,

Yes, that’s exactly the change I was thinking of making.  I’ll submit a PR in 
the next day or so.
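
Roughly, the idea is a sketch like this (just an illustration of the approach, not the actual patch; the names `zeppelin.spark.app.name` and `ZEPPELIN_SPARK_APP_NAME` are placeholders I'm assuming for now, not final):

```java
// Sketch: resolve the default spark.app.name from a JVM property or an
// environment variable, falling back to the current hard-coded default.
// "zeppelin.spark.app.name" and "ZEPPELIN_SPARK_APP_NAME" are placeholder
// names, not necessarily what the final PR will use.
public class SparkAppNameDefault {
  static String getDefaultAppName() {
    String fromProperty = System.getProperty("zeppelin.spark.app.name");
    if (fromProperty != null && !fromProperty.isEmpty()) {
      return fromProperty;
    }
    String fromEnv = System.getenv("ZEPPELIN_SPARK_APP_NAME");
    if (fromEnv != null && !fromEnv.isEmpty()) {
      return fromEnv;
    }
    return "Zeppelin"; // existing hard-coded default
  }
}
```

That way each docker container can just export the env var with the user's id before starting zeppelin.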

Thanks!

Cheers,
Craig

> On Nov 5, 2015, at 9:30 AM, moon soo Lee <m...@apache.org> wrote:
> 
> It's easy to change the default value of spark.app.name based on an
> environment variable or JVM property.
> 
> Changing
> https://github.com/apache/incubator-zeppelin/blob/master/spark/src/main/java/org/apache/zeppelin/spark/SparkInterpreter.java#L93
> would work. You can see how other settings read their default value from an
> env variable or JVM property, e.g. L95.
> 
> Note that, in this way, the environment variable only supplies the default
> value of the interpreter setting. Once the interpreter setting has been
> created and saved, it will no longer be changed by the environment variable.
> 
> Thanks,
> moon
> 
> On Thu, Nov 5, 2015 at 11:54 PM Craig Ching <craigch...@gmail.com> wrote:
> So any interest in a pull request?  It seems like a pretty straightforward
> change, and I'd be happy to do it, including creating an env var and
> documenting it in zeppelin-env.sh.template.  It also seems like a useful
> feature when you're running more than one zeppelin instance against a shared
> spark cluster.
> 
> Cheers,
> Craig
> 
>> On Nov 4, 2015, at 3:14 PM, Josef A. Habdank <jahabd...@gmail.com> wrote:
>> 
>> A small comment (as a new Zeppelin user): use S3 to store the notebooks, and
>> use some S3 tool to upload/download them. If you need a hand with setting up
>> S3 as Notebook Storage, I can help, as I set it up just today and it works
>> very well :)
>> 
>> On 4 November 2015 at 22:01, Craig Ching <craigch...@gmail.com> wrote:
>> Hi Moon,
>> 
>> Can I set that when I start zeppelin up?  The last thing I want to have to 
>> do is tell my users they need to change this.  I’m trying to introduce new 
>> users to spark and I feel that zeppelin is a great way to do that.  So the 
>> less I have them do the better.
>> 
>> Here’s what I’m doing.  First, I have zeppelin containerized in a docker 
>> container.  This docker container is parameterized with the port and the 
>> spark master.  Then I wrote a little UI.  The user gives me their name (any 
>> unique id really) and I fire up a docker container running zeppelin for them 
>> with their own ports (I find free ports for the web port and the web socket 
>> port and reserve them).  It’s their own little zeppelin environment where 
>> they can create notebooks and upload and download them (I haven’t quite 
>> figured out the upload and download just yet).
>> 
>> Thanks, I appreciate the response!
>> 
>> Cheers,
>> Craig
>> 
>>> On Nov 4, 2015, at 9:49 AM, moon soo Lee <m...@apache.org> wrote:
>>> 
>>> Hi,
>>> 
>>> I think you can change the "spark.app.name" property of your spark
>>> interpreter setting in the "Interpreter" menu.
>>> 
>>> Best,
>>> moon
>>> 
>>> On Wed, Nov 4, 2015 at 2:12 PM Craig Ching <craigch...@gmail.com> wrote:
>>> Hi all,
>>> 
>>> Just starting to play with zeppelin a bit.  I was wondering if there was a
>>> way to set spark.app.name?  It appears to be
>>> hard-coded in the source (SparkInterpreter), would a PR be accepted to 
>>> change this?  I want to be able to fire up many zeppelin instances based on 
>>> a user id and have the spark jobs submitted to a cluster with those ids so 
>>> that users can see the status of their jobs in the spark UI.  Thoughts?
>>> 
>>> Cheers,
>>> Craig
>> 
>> 
> 
