Thanks a lot for clarifying that.
Thanks, Alex
On Oct 20, 2018 7:15 AM, "Jhon Anderson Cardenas Diaz" <
jhonderson2...@gmail.com> wrote:
Hi, you can specify it in zeppelin-env.sh or in the Dockerfile.
Zeppelin will look for that variable first in the interpreter settings, and
if it does not find it there, it will look in the Zeppelin environment
variables; so you can specify it in either place, but since it does not
change frequently, setting it once in zeppelin-env.sh is usually enough.
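For reference, a minimal sketch of both options. The /opt/spark path is an assumption; point it at wherever Spark is actually installed on your machine or image:

```shell
# Option 1: conf/zeppelin-env.sh
# Zeppelin sources this file at startup, so the Spark interpreter
# inherits the variable if it is not set in the interpreter settings.
export SPARK_HOME=/opt/spark

# Option 2: Dockerfile (if you run Zeppelin in a container)
#   ENV SPARK_HOME=/opt/spark
```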
Thanks for the quick reply. Should I set it on the Zeppelin process or in
the Spark interpreter settings?
Thanks, Alex.
On Fri, Oct 19, 2018 at 4:53 PM Jeff Zhang wrote:
You need to specify SPARK_HOME, which is the directory where Spark is installed.
Alex Dzhagriev wrote on Sat, Oct 20, 2018 at 3:12 AM:
Hello,

I have a remote Spark cluster and I'm trying to use it by setting the Spark
interpreter property:

master spark://spark-cluster-master:7077

However, I'm getting the following error:

java.lang.RuntimeException: SPARK_HOME is not specified in
interpreter-setting for non-local mode, if you sp…