Hi, you can specify it in zeppelin-env.sh or in the Dockerfile.

Zeppelin looks for that variable first in the interpreter settings, and if
it does not find it there, it falls back to the Zeppelin environment
variables; so you can specify it in either place, but since it does not
change frequently it is better kept as a Zeppelin environment variable.
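
For reference, here is a minimal sketch of both options (the /opt/spark
path is only an example; point it at wherever Spark is actually installed
in your image):

  # in conf/zeppelin-env.sh
  export SPARK_HOME=/opt/spark

  # or in the Dockerfile
  ENV SPARK_HOME=/opt/spark

If you go the interpreter-settings route instead, set a SPARK_HOME
property on the Spark interpreter in the Zeppelin UI with the same value.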

On Sat, Oct 20, 2018 at 12:25 AM, Alex Dzhagriev (<dzh...@gmail.com>)
wrote:

> Thanks for the quick reply. Should I specify it to the Zeppelin process or
> the Spark interpreter?
>
> Thanks, Alex.
>
> On Fri, Oct 19, 2018 at 4:53 PM Jeff Zhang <zjf...@gmail.com> wrote:
>
>> You need to specify SPARK_HOME, which is where Spark is installed.
>>
>>
>> On Sat, Oct 20, 2018 at 3:12 AM, Alex Dzhagriev <dzh...@gmail.com> wrote:
>>
>>> Hello,
>>>
>>> I have a remote Spark cluster and I'm trying to use it by setting the
>>> Spark interpreter property master to spark://spark-cluster-master:7077;
>>> however, I'm getting the following error:
>>>
>>> java.lang.RuntimeException: SPARK_HOME is not specified in
>>> interpreter-setting for non-local mode, if you specify it in
>>> zeppelin-env.sh, please move that into interpreter setting
>>>
>>> version: Docker Image 0.8.0
>>>
>>> Thanks, Alex.
>>>
>>
