unsubscribe

2018-03-14 Thread Geoff Foote
unsubscribe From: Karan Sewani [mailto:karan.sew...@guenstiger.de] Sent: 14 March 2018 06:57 To: users@zeppelin.apache.org Subject: Re: Spark Interpreter error: 'not found: type' Hello Marcus, Maybe it has something to do with https://stackoverflow.com/questions/13008792/how-to-import-class-u

Re: Zeppelin - Spark Driver location

2018-03-14 Thread ankit jain
Hi Jeff, Not clear on that - I thought spark-submit was run when we run a paragraph, so how does the .sh file come into play? Thanks Ankit On Tue, Mar 13, 2018 at 5:43 PM, Jeff Zhang wrote: > > spark-submit is called in bin/interpreter.sh, I didn't try standalone > cluster mode. It is expected

Re: Zeppelin - Spark Driver location

2018-03-14 Thread ankit jain
Also, Spark standalone cluster mode should work even before this new release, right? On Wed, Mar 14, 2018 at 8:43 AM, ankit jain wrote: > Hi Jeff, > Not clear on that - I thought spark-submit was run when we run a > paragraph, so how does the .sh file come into play? > > Thanks > Ankit > > On Tue

multiple users sharing single Spark context

2018-03-14 Thread Ruslan Dautkhanov
Let's say we have a Spark interpreter set up as "The interpreter will be instantiated *Globally* in *shared* process". While one user is using the Spark interpreter, other users trying to use the same interpreter get PENDING until the first user's code completes. Per Spark documentation,
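The Spark feature being alluded to here is fair scheduling within a single SparkContext: with `spark.scheduler.mode=FAIR` and named pools, jobs submitted from different threads can run concurrently inside one context instead of queuing FIFO. A minimal configuration sketch (the pool name, weight, and minShare values are illustrative examples, not recommendations):

```xml
<?xml version="1.0"?>
<!-- fairscheduler.xml: one pool that jobs can be assigned to; values are examples -->
<allocations>
  <pool name="notebook_pool">
    <schedulingMode>FAIR</schedulingMode>
    <weight>1</weight>
    <minShare>2</minShare>
  </pool>
</allocations>
```

enabled via two Spark properties (e.g. in the interpreter settings; the file path is a placeholder):

```
spark.scheduler.mode            FAIR
spark.scheduler.allocation.file /path/to/fairscheduler.xml
```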

Re: multiple users sharing single Spark context

2018-03-14 Thread Ruslan Dautkhanov
Looked at the code... the only place Zeppelin handles spark.scheduler.pool is here: https://github.com/apache/zeppelin/blob/d762b5288536201d8a2964891c556efaa1bae867/spark/interpreter/src/main/java/org/apache/zeppelin/spark/SparkSqlInterpreter.java#L103 I don't think it matches the Spark documentation
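For context, that line presumably sets the pool via `SparkContext.setLocalProperty("spark.scheduler.pool", ...)`. Spark keeps such local properties in an inheritable thread-local, so the pool applies only to jobs submitted from the setting thread. A plain-Java sketch of that mechanism (class and method names here are illustrative, not Zeppelin's or Spark's actual code):

```java
import java.util.HashMap;
import java.util.Map;

public class LocalProps {
    // Per-thread "local properties"; a child thread receives a *copy* of its
    // parent's map, so one user's pool setting cannot clobber another's.
    private static final InheritableThreadLocal<Map<String, String>> PROPS =
        new InheritableThreadLocal<Map<String, String>>() {
            @Override protected Map<String, String> initialValue() {
                return new HashMap<>();
            }
            @Override protected Map<String, String> childValue(Map<String, String> parent) {
                return new HashMap<>(parent);
            }
        };

    public static void setLocalProperty(String key, String value) {
        PROPS.get().put(key, value);
    }

    public static String getLocalProperty(String key) {
        return PROPS.get().get(key);
    }

    public static void main(String[] args) throws InterruptedException {
        setLocalProperty("spark.scheduler.pool", "user1_pool");
        Thread user2 = new Thread(() -> {
            // A second user thread sets its own pool on its own copy.
            setLocalProperty("spark.scheduler.pool", "user2_pool");
            System.out.println("user2 thread: " + getLocalProperty("spark.scheduler.pool"));
        });
        user2.start();
        user2.join();
        // The main thread's pool is untouched by the other thread's change.
        System.out.println("main thread: " + getLocalProperty("spark.scheduler.pool"));
    }
}
```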

Re: multiple users sharing single Spark context

2018-03-14 Thread ankit jain
We are seeing the same PENDING behavior despite running the Spark interpreter in "Isolated per User" - we expected one SparkContext to be created per user, and indeed did see multiple SparkSubmit processes spun up on the Zeppelin pod. But why go to PENDING if there are multiple contexts that can be run in

Re: Zeppelin - Spark Driver location

2018-03-14 Thread Jeff Zhang
spark-submit only runs when you run the first paragraph using the Spark interpreter. After that, each paragraph sends its code to the running Spark app to execute. >>> Also, Spark standalone cluster mode should work even before this new release, right? I didn't verify that, not sure whether other people very
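To make the launch path concrete: bin/interpreter.sh starts the remote interpreter process once (via spark-submit), and subsequent paragraphs talk to that already-running process over Zeppelin's remote-interpreter protocol. A rough sketch of what that invocation amounts to; the flags, jar path, and variable names below are illustrative assumptions, not the script's exact contents:

```sh
# Illustrative sketch only - not the literal contents of bin/interpreter.sh
"${SPARK_HOME}/bin/spark-submit" \
  --class org.apache.zeppelin.interpreter.remote.RemoteInterpreterServer \
  --master "${MASTER}" \
  "${ZEPPELIN_HOME}/interpreter/spark/spark-interpreter.jar" \
  "${CALLBACK_HOST}" "${CALLBACK_PORT}"
```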

Re: multiple users sharing single Spark context

2018-03-14 Thread Jeff Zhang
Globally shared mode means all users share the SparkContext and also the same Spark interpreter. That's why, in this mode, code is executed sequentially; concurrency is not allowed here, as there may be dependencies between paragraphs and concurrency cannot guarantee the execution order. For you
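Jeff's point about sequential execution can be modeled with a plain single-threaded executor: a shared interpreter behaves like one worker draining a FIFO queue, so later paragraphs wait (PENDING) behind earlier ones. A runnable Java analogy (names are illustrative; this is not Zeppelin code):

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;
import java.util.Queue;
import java.util.concurrent.ConcurrentLinkedQueue;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

public class SharedInterpreterDemo {
    // With one worker thread (a shared interpreter), submitted "paragraphs"
    // complete strictly in submission order; nThreads > 1 would allow overlap,
    // which is what per-user isolated interpreters provide.
    public static List<Integer> completionOrder(int nThreads, List<Integer> paragraphIds)
            throws InterruptedException {
        ExecutorService pool = Executors.newFixedThreadPool(nThreads);
        Queue<Integer> done = new ConcurrentLinkedQueue<>();
        for (int id : paragraphIds) {
            pool.submit(() -> done.add(id)); // each "paragraph" records when it finishes
        }
        pool.shutdown();
        pool.awaitTermination(5, TimeUnit.SECONDS);
        return new ArrayList<>(done);
    }

    public static void main(String[] args) throws InterruptedException {
        // Single worker: paragraphs 3, 1, 2 finish exactly in submission order.
        System.out.println(completionOrder(1, Arrays.asList(3, 1, 2)));
    }
}
```

With `nThreads = 1` the completion order is deterministic and equals the submission order; that determinism is exactly why a globally shared interpreter cannot run users' paragraphs concurrently.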