By default, the interpreter is shared across notes. In newer Zeppelin
versions (at least 0.7), however, you can change this through the
interpreter setting options.

Quoting from https://zeppelin.apache.org/docs/latest/interpreter/spark.html:

"
Interpreter setting option
You can choose one of shared, scoped and isolated options wheh you
configure Spark interpreter. Spark interpreter creates separated Scala
compiler per each notebook but share a single SparkContext in scoped mode
(experimental). It creates separated SparkContext per each notebook in
isolated mode."
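
A quick way to check which mode you are in (just a sketch, assuming the
default %spark interpreter group): run the paragraph below in two different
notes. With shared or scoped binding, both notes print the same application
ID because they share one SparkContext; with isolated binding, each note
prints its own.

%spark
// Shared/scoped: both notes report the same Spark application ID.
// Isolated: each note has its own SparkContext, so the IDs differ.
println(sc.applicationId)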

On Tue, Apr 25, 2017 at 9:29 AM, kant kodali <kanth...@gmail.com> wrote:

> Do I need to do anything to share the interpreter across notes, or is that
> just the default behavior? If it is the default and, say, a company of 100
> employees wants to use the notebook, do they all share the same SparkContext
> and StreamingContext by default?
>
> On Tue, Apr 25, 2017 at 8:43 AM, Felix Cheung <felixcheun...@hotmail.com>
> wrote:
>
>> You could have multiple notes sharing an interpreter - could you have one
>> note for setup that calls streamingContext.start(), and another note on a
>> schedule just to run the select SQL statement?
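>>
>> Something along these lines, reusing the helpers from your snippet below
>> (a rough, untested sketch):
>>
>> // Note 1 - run once: set up the pipeline and start the stream.
>> val jsonDStream = getJsonDStream()
>> jsonDStream.foreachRDD { rdd =>
>>   // refresh the temp view with each micro-batch
>>   spark.read.json(rdd).createOrReplaceTempView("dataframe")
>> }
>> client.startStream()
>>
>> // Note 2 - with the note's cron scheduler enabled and bound to the same
>> // (shared) interpreter, this paragraph re-runs on a schedule:
>> %spark.sql select * from dataframe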
>>
>> _____________________________
>> From: kant kodali <kanth...@gmail.com>
>> Sent: Tuesday, April 25, 2017 8:12 AM
>> Subject: Re: How to create a real time dashboards from spark using web
>> socket?
>> To: <users@zeppelin.apache.org>
>>
>>
>>
>> Yeah, this creates a problem when dealing with Spark Streaming, because we
>> can't call streamingContext.start() multiple times - it will result in an
>> IllegalStateException. Anyway, I'm still looking for a way to update this
>> dashboard/UI/graph through a web socket.
>>
>> On Tue, Apr 25, 2017 at 7:03 AM, DuyHai Doan <doanduy...@gmail.com>
>> wrote:
>>
>>> Yes, the scheduler applies to the entire note, not just a single paragraph.
>>> On 25 Apr 2017 02:39, "kant kodali" <kanth...@gmail.com> wrote:
>>>
>>>> Also, it doesn't look like we can run the scheduler on a single
>>>> paragraph - do we always have to run it for the entire note?
>>>>
>>>> On Mon, Apr 24, 2017 at 1:06 PM, kant kodali <kanth...@gmail.com>
>>>> wrote:
>>>>
>>>>> Hi All,
>>>>>
>>>>> I currently do the following
>>>>>
>>>>> val jsonDStream = getJsonDStream()
>>>>>
>>>>> // register each micro-batch as a temp view so it can be queried with SQL
>>>>> jsonDStream.foreachRDD { rdd =>
>>>>>     val jsonDF = spark.read.json(rdd)
>>>>>     jsonDF.createOrReplaceTempView("dataframe")
>>>>> }
>>>>> client.startStream()
>>>>>
>>>>> %spark.sql select * from dataframe
>>>>>
>>>>> I can see the data, and every time I click the run button I can see the
>>>>> updates as well. However, is there any way to update this
>>>>> dashboard/UI/graph through a web socket? I don't want to do polling.
>>>>>
>>>>> Thanks!
>>>>>
>>>>
>>>>
>>
>>
>>
>


-- 
Shan S. Potti,
737-333-1952
https://www.linkedin.com/in/shanmukhasreenivas
