Github user tae-jun commented on the issue:
https://github.com/apache/zeppelin/pull/1339
Tested and worked as expected!
Fantastic work! There are also other benefits from this change.
Before it, users couldn't use the `SPARK_SUBMIT_OPTIONS` env variable with
the embedded Spark (am I right?).
But now it's possible! I tested with `export
SPARK_SUBMIT_OPTIONS="--driver-memory 4G"` and could confirm on the Spark UI
that it works.

Therefore, I think it would be better to update `conf/zeppelin-env.sh`.
There is a comment which is:
```sh
## Use provided spark installation ##
## defining SPARK_HOME makes Zeppelin run spark interpreter process using spark-submit
##
# export SPARK_HOME                  # (required) When it is defined, load it instead of Zeppelin embedded Spark libraries
# export SPARK_SUBMIT_OPTIONS       # (optional) extra options to pass to spark submit. eg) "--driver-memory 512M --executor-memory 1G".
# export SPARK_APP_NAME             # (optional) The name of spark application.

## Use embedded spark binaries ##
## without SPARK_HOME defined, Zeppelin still able to run spark interpreter process using embedded spark binaries.
## however, it is not encouraged when you can define SPARK_HOME
##
```
These comments should be updated accordingly. In my opinion, we no longer
need to encourage using an external Spark :-)
Also, is it possible to use the embedded Spark without running `get-spark`?
If not, I think this should be clearly documented in the README.
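For example, the embedded-Spark section of `conf/zeppelin-env.sh` could be revised along these lines (just a sketch of possible wording from my side, not a final proposal — I only verified `SPARK_SUBMIT_OPTIONS` myself):

```shell
## Use embedded spark binaries ##
## Without SPARK_HOME defined, Zeppelin runs the spark interpreter process
## using the embedded spark binaries.
## Note: SPARK_SUBMIT_OPTIONS now applies to the embedded spark as well,
## e.g.:
# export SPARK_SUBMIT_OPTIONS="--driver-memory 4G"
##
```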
LGTM!