Just remove SPARK_HOME from zeppelin-env.sh, and instead define it in each
Spark interpreter's settings. You can create two Spark interpreters, one for
Spark 1.6 and another for Spark 2. The only difference between them is the
SPARK_HOME you define in the interpreter setting.
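
For example, in the Zeppelin UI (Interpreter page) you could clone the
built-in spark interpreter twice and set SPARK_HOME differently in each.
The interpreter names and paths below are hypothetical, just to illustrate:

```
# Interpreter "spark16" -> Properties:
SPARK_HOME = /opt/spark-1.6.3        # path to your Spark 1.6 install

# Interpreter "spark2" -> Properties:
SPARK_HOME = /opt/spark-2.2.1        # path to your Spark 2.x install
```

Then in a notebook you can pick the version per paragraph with the
interpreter prefix, e.g. %spark16 or %spark2, without restarting Zeppelin.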



Michaël Van de Borne <michael.vandebo...@gmail.com> wrote on Tue, Jan 23, 2018 at 6:42 PM:

> Hi list,
>
> I'd like my notebooks to support both spark 1.6 & spark 2.
> I managed to get both versions working fine, with the SPARK_HOME variable
> in zeppelin-env.sh. But just one at a time.
> So I need to change the variable and restart zeppelin when I want to swap
> spark versions.
> Is it possible to somehow map a spark distribution to an interpreter group
> so that I can get both versions supported at the same time?
>
>
> Thank you,
>
> cheers,
>
> m.
>
