Hi Patrick,

You can actually set SPARK_HOME in the Interpreter tab. However, I haven't
been able to test this feature thoroughly, and I'll submit a PR for setting
environment variables as well as properties from the Interpreter tab. I
understand you want to use different versions of Spark with multiple Spark
settings. Could you please file a new JIRA issue for it?

Regards,
Jongyoul

On Mon, Aug 8, 2016 at 5:09 PM, DuyHai Doan <doanduy...@gmail.com> wrote:

> Easier solution: create a different instance of the Spark interpreter for
> each use case:
>
> 1) For embedded Spark, just leave the master property set to local[*]
> 2) For system-provided Spark, edit the Spark interpreter settings and
> change the master to spark://<master_ip>:7077
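>
> A minimal sketch of what the two interpreter settings might look like
> (the instance names and the SPARK_HOME path are hypothetical, for
> illustration only):
>
>     # Interpreter instance "spark_local" (embedded Spark)
>     master = local[*]
>
>     # Interpreter instance "spark_cluster" (system-provided Spark)
>     master = spark://<master_ip>:7077
>     SPARK_HOME = /opt/spark    # hypothetical path to the system Spark
>
> Then bind the appropriate interpreter instance to each notebook from the
> notebook's interpreter binding menu.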
>
> On Mon, Aug 8, 2016 at 9:52 AM, Patrick Duflot <
> patrick.duf...@iba-group.com> wrote:
>
>> Hello Zeppelin users,
>>
>>
>>
>> I am looking to configure Zeppelin so that it uses the embedded Spark for
>> some notebooks but the system-provided Spark for others.
>>
>> However, it seems that SPARK_HOME is a global parameter in
>> zeppelin-env.sh.
>>
>> Is it possible to override this setting at the notebook level?
>>
>>
>>
>> Thanks!
>>
>>
>>
>> Patrick
>>
>>
>>
>
>


-- 
이종열, Jongyoul Lee, 李宗烈
http://madeng.net
