Thanks for the quick reply. Should I specify it for the Zeppelin process or
in the Spark interpreter settings?

Thanks, Alex.

On Fri, Oct 19, 2018 at 4:53 PM Jeff Zhang <zjf...@gmail.com> wrote:

> You need to specify SPARK_HOME, which is the directory where Spark is
> installed.
>
>
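> For example, in the Zeppelin UI (Interpreter menu -> spark -> edit ->
> Properties), the setting would look roughly like this sketch; the path
> /opt/spark is an assumption, use wherever Spark is actually installed:
>
>   SPARK_HOME  /opt/spark
>   master      spark://spark-cluster-master:7077
>
> Note that since you run the 0.8.0 Docker image, that path must exist
> inside the Zeppelin container, e.g. by mounting a Spark distribution
> into it.
>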
> Alex Dzhagriev <dzh...@gmail.com> wrote on Saturday, October 20, 2018 at 3:12 AM:
>
>> Hello,
>>
>> I have a remote Spark cluster and I'm trying to use it by setting the
>> Spark interpreter property:
>>
>> master spark://spark-cluster-master:7077
>>
>> However, I'm getting the following error:
>>
>> java.lang.RuntimeException: SPARK_HOME is not specified in
>> interpreter-setting for non-local mode, if you specify it in
>> zeppelin-env.sh, please move that into interpreter setting
>>
>> Version: Zeppelin 0.8.0 Docker image
>>
>> Thanks, Alex.
>>
>
