admin account in zeppelin

2016-06-21 Thread Chen Song
… /api/credential/** = authc, roles[admin]

Thanks for your feedback. -- Chen Song
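The restriction discussed in this thread is normally expressed in Zeppelin's `conf/shiro.ini`. A minimal sketch of the `[urls]` section that locks the interpreter, configuration, and credential endpoints behind an `admin` role (the role name and the LDAP wiring that grants it are deployment-specific assumptions):

```ini
# conf/shiro.ini -- restrict write-capable REST endpoints to "admin"
# (role name is an assumption; map it to your LDAP group in [main]/[roles])
[urls]
/api/version = anon
/api/interpreter/** = authc, roles[admin]
/api/configurations/** = authc, roles[admin]
/api/credential/** = authc, roles[admin]
/** = authc
```

Order matters in Shiro URL filtering: the more specific `/api/...` patterns must appear before the catch-all `/** = authc` line.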

Re: admin account in zeppelin

2016-06-21 Thread Chen Song
… Thanks, Ben

On Jun 21, 2016, at 2:44 PM, Chen Song wrote: > I am new to Zeppelin and have successfully set up LDAP authentication on Zeppelin. I also want to restrict write access to interpreters, credentials, and configurations to admin users only.

Re: admin account in zeppelin

2016-06-21 Thread Chen Song
… = *
role2 = *
role3 = *

[urls]
/api/version = anon
/** = authc

On Tue, Jun 21, 2016 at 3:25 PM, Benjamin Kim wrote: > Chen, if you don’t mind, how did you integrate LDAP with Zeppelin? As far as I know, Shiro …
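For reference, LDAP integration in Zeppelin is usually wired through a Shiro realm in the `[main]` section of `conf/shiro.ini`. A minimal sketch using Shiro's stock `JndiLdapRealm` (the host, port, and DN template below are placeholder assumptions, not values taken from this thread):

```ini
# conf/shiro.ini -- sketch of an LDAP realm for authentication
[main]
ldapRealm = org.apache.shiro.realm.ldap.JndiLdapRealm
# Placeholder DN template and server URL -- adjust for your directory
ldapRealm.userDnTemplate = uid={0},ou=users,dc=example,dc=com
ldapRealm.contextFactory.url = ldap://ldap-host:389
ldapRealm.contextFactory.authenticationMechanism = SIMPLE
securityManager.realms = $ldapRealm
```

Role-to-group mapping for the `roles[admin]` check is a separate concern and depends on which realm implementation you use.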

spark interpreter

2016-06-21 Thread Chen Song
Zeppelin provides 3 binding modes for each interpreter. With a `scoped` or `shared` Spark interpreter, every user shares the same SparkContext. Sorry for the dumb question: how does this differ from running Spark via Livy Server? -- Chen Song

Re: spark interpreter

2016-06-28 Thread Chen Song
… the current user information to the Livy interpreter, and the Livy interpreter creates a different session per user via Livy Server. Hope this helps. Thanks, moon. On Tue, Jun 21, 2016 at 6:41 PM Chen Song wrote: > Zeppelin provides 3 binding modes …

Run zeppelin spark interpreter in kerberos

2016-07-19 Thread Chen Song
I have a question on running the Zeppelin Spark interpreter in a Kerberized environment. Spark comes with runtime conf settings that allow you to specify the keytab and principal. My questions are: 1. When using Livy, does it rely on the same mechanism when starting Spark? 2. Whether or not Livy is used, th…
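The runtime conf referred to here is Spark's standard Kerberos identity settings for YARN. A sketch of the relevant `spark-defaults.conf` entries (the keytab path and principal below are placeholders):

```ini
# spark-defaults.conf -- Kerberos identity for long-running Spark apps
# (placeholder keytab path and principal; use your own)
spark.yarn.keytab     /etc/security/keytabs/zeppelin.keytab
spark.yarn.principal  zeppelin@EXAMPLE.COM
```

With both set, Spark on YARN can re-login from the keytab itself, which is what makes long-lived interpreter processes viable without manual `kinit` renewal.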

Re: Run zeppelin spark interpreter in kerberos

2016-07-20 Thread Chen Song
… to refresh the ticket on the Zeppelin side, it depends on the Zeppelin-side implementation. On Wed, Jul 20, 2016 at 4:55 AM, Chen Song wrote: > I have a question on running Zeppelin Spark interpreter in a Kerberized environment. Spark comes with a runtime c…

spark.jars option for Zeppelin over Livy

2016-08-02 Thread Chen Song
When using Zeppelin over Livy, how would I set the *spark.jars* option? I tried setting it in my spark-defaults.conf and Livy doesn't respect it at all. Other properties in spark-defaults.conf seem to be picked up properly. Chen

Re: spark.jars option for Zeppelin over Livy

2016-08-04 Thread Chen Song
Have you tried setting it in the Interpreter menu under Livy? On Tue, Aug 2, 2016 at 11:04 AM -0700, "Chen Song" <chen.song...@gmail.com> wrote: > When using Zeppelin over Livy, how would I set the *spark…
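The Interpreter-menu suggestion maps to the Livy interpreter's property-forwarding convention: Spark configs set in the Livy interpreter settings are passed to the Livy session when prefixed with `livy.`. A sketch of the property (the jar path is a placeholder assumption):

```ini
# Zeppelin Interpreter menu -> livy -> properties
# Spark configs are forwarded to the Livy session via the "livy." prefix
livy.spark.jars = /path/to/your-dependency.jar
```

Setting it in `spark-defaults.conf` on the Zeppelin host does not help here because the Spark application is launched by the Livy server, not by Zeppelin.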

question on livy session in zeppelin

2016-08-04 Thread Chen Song
Hi. When using Zeppelin over Livy, it appears that Livy will delete the session (and terminate the Spark job) after one hour of inactivity. After that, the SparkContext is closed and the user sees an error like 404 Not Found in the Zeppelin notebook. From this point on, is the only way to proceed …

Re: question on livy session in zeppelin

2016-08-04 Thread Chen Song
… session timeout by changing livy.server.session.timeout (takes time in milliseconds). Filed https://issues.apache.org/jira/browse/ZEPPELIN-1293 to track it. Thanks, Vinay. On Thu, Aug 4, 2016 at 12:57 PM, Chen Song wrote: > Hi. When u…
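Per the reply above, the idle timeout lives in Livy's server configuration, not in Zeppelin. A sketch of the `livy.conf` entry, with the one-hour default expressed in milliseconds as this thread describes (the exact value shown is an example):

```ini
# livy.conf -- session idle timeout, in milliseconds per this thread
# 3600000 ms = 1 hour; raise it to keep idle sessions alive longer
livy.server.session.timeout = 3600000
```

After the timeout fires, the session and its SparkContext are gone, which is why the notebook starts returning 404 Not Found until a new session is created.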

Re: question on livy session in zeppelin

2016-08-05 Thread Chen Song
… Is there a way to support different custom settings for the same interpreter? For example, one user may want more memory for his notebook than other users. Chen. On Thu, Aug 4, 2016 at 10:58 PM Chen Song wrote: > Thanks Vinay. Following up on this, in a multi-user environment, …

Use user id dynamically in JDBC interpreter

2016-11-02 Thread Chen Song
Hello. Is there a way to configure a JDBC interpreter to use the logged-in user id instead of a static value? Something like the change shown below: default.user -> jdbc_user becomes default.user -> ${user_id} Chen
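For context, a sketch of the static JDBC interpreter properties the question refers to. `${user_id}` is the asker's proposed placeholder, not a confirmed Zeppelin feature, and the connection details below are assumptions:

```ini
# Zeppelin Interpreter menu -> jdbc -> properties (static form)
default.driver   = org.postgresql.Driver
default.url      = jdbc:postgresql://db-host:5432/mydb
default.user     = jdbc_user      # the static value the question wants replaced
default.password = ********
```

The question is whether `default.user` can be resolved per logged-in Zeppelin user at paragraph-run time rather than fixed once in the interpreter settings.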