Hello,

I have a remote Spark cluster and I'm trying to use it by setting the Spark interpreter property

    master    spark://spark-cluster-master:7077

However, I'm getting the following error:

    java.lang.RuntimeException: SPARK_HOME is not specified in interpreter-setting for non-local mode, if you specify it in zeppelin-env.sh, please move that into interpreter setting

Version: Docker Image 0.8.0

Thanks,
Alex
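P.S. If I'm reading the error right, it wants SPARK_HOME added as a property of the Spark interpreter itself rather than in zeppelin-env.sh, i.e. something like the line below next to the master property. The path is only a placeholder on my part; I don't know where (or whether) a Spark distribution is installed inside the 0.8.0 Docker image, or what it should point to when the cluster itself is remote.

    SPARK_HOME    /path/to/spark    (placeholder path, not verified)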