Thanks moon. If that works, it seems unnecessary to have different profiles for different Spark versions; we can always build with the latest Spark version.
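
For reference, the build I tried is roughly the following (only the Spark profile shown; other profiles and flags may vary by branch, so treat it as a sketch):

    mvn clean package -Pspark-2.0 -DskipTests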
On Thu, Aug 4, 2016 at 11:26 AM, moon soo Lee <m...@apache.org> wrote:

> Hi,
>
> Could you try removing 'SPARK_HOME' from conf/zeppelin-env.sh and adding a
> 'SPARK_HOME' property (pointing at the desired Spark directory) in each
> individual Spark interpreter setting on the GUI? This should work.
>
> Best,
> moon
>
> On Wed, Aug 3, 2016 at 6:42 PM Jeff Zhang <zjf...@gmail.com> wrote:
>
>> I built Zeppelin with the spark-2.0 profile enabled, and it seems I can
>> also run the Spark 1.6 interpreter. But I am not sure whether running
>> different versions of the Spark interpreter in one Zeppelin build is
>> officially supported. My guess is that it is not, otherwise we wouldn't
>> need profiles for different Spark versions. If it is not, then I think
>> supporting multiple versions of the Spark interpreter in one Zeppelin
>> build might be useful; otherwise users need to set up multiple Zeppelin
>> servers for different Spark versions.
>>
>> --
>> Best Regards
>>
>> Jeff Zhang
>

--
Best Regards

Jeff Zhang
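
P.S. If I understand the suggestion correctly, the setup would look roughly like this (interpreter setting names and paths below are just placeholders):

    # conf/zeppelin-env.sh: leave SPARK_HOME unset, e.g. comment it out
    # export SPARK_HOME=/usr/lib/spark

    # Spark interpreter setting "spark_16" on the GUI:
    SPARK_HOME = /opt/spark-1.6.2

    # Spark interpreter setting "spark_20" on the GUI:
    SPARK_HOME = /opt/spark-2.0.0

Then each notebook can bind to whichever Spark interpreter setting it needs.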