Github user AhyoungRyu commented on the issue:

    https://github.com/apache/zeppelin/pull/1339
  
    @bzz Yeah, I also wanted to get more feedback on this change since it's a 
huge one, as you said. Thanks for asking, and I'm happy to explain again :)
    
    > **1.** Is the comment above how it works now? Meaning, does a download of 
Apache Spark (100+ MB) happen on the first run of `./bin/zeppelin-daemon.sh` or 
`./bin/zeppelin.sh`, without asking the user?
    
    At first, I intended to ask something like "Do you want to download local 
Spark?" when the user starts the Zeppelin daemon. But there is a lot more to 
think about, since this question would be asked before the Zeppelin server 
starts. For example, [some people run Zeppelin as a startup 
service](https://github.com/apache/zeppelin/pull/1339#issuecomment-250672904) 
with their own scripts, as @jongyoul said, and this kind of interactive mode 
would break their environments.
    So I decided to download this local Spark with `./bin/zeppelin-daemon.sh 
get-spark` or `./bin/zeppelin.sh get-spark`. With the `get-spark` option, users 
aren't prompted at startup; they can choose for themselves whether to download 
this local-mode Spark. They can then use this local Spark without any 
configuration, aka `zero configuration`. But we need to make users aware that 
the `get-spark` option exists. That's why I updated the documentation pages.
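    To make the flow concrete, here is a minimal sketch of how a `get-spark` 
subcommand could be dispatched inside `bin/zeppelin-daemon.sh`. This is 
illustrative only: the function name `download_spark`, the `local-spark` 
directory, and the command list are my assumptions, not the PR's actual code.

```shell
#!/bin/sh
# Hypothetical dispatch sketch for bin/zeppelin-daemon.sh (not the PR's code).

ZEPPELIN_HOME="${ZEPPELIN_HOME:-$(pwd)}"

download_spark() {
  # Delegate to the helper script the PR adds; the real download happens there.
  echo "Downloading local-mode Spark into ${ZEPPELIN_HOME}/local-spark ..."
  # sh "${ZEPPELIN_HOME}/bin/download-spark.sh"  # network call, left commented in this sketch
}

case "$1" in
  start)     echo "Starting Zeppelin (no Spark download happens here)" ;;
  get-spark) download_spark ;;
  *)         echo "Usage: $0 {start|stop|restart|get-spark}" ;;
esac
```

    The point of the design is that `start` never triggers a download, so 
startup-service scripts keep working unchanged; only an explicit `get-spark` 
invocation does.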
    
    > **2.** Does this also mean that on CI it will happen on every run of 
SeleniumTests as well?
    
    This change won't affect the CI build. I added `./bin/download-spark.sh`, 
which downloads Spark only when the user runs `./bin/zeppelin-daemon.sh 
get-spark`.
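    For illustration, a script like `./bin/download-spark.sh` could look 
roughly like the sketch below. The Spark/Hadoop versions, the mirror URL, and 
the `local-spark` cache directory are all assumptions of mine, not the PR's 
actual values; the real fetch is left commented out.

```shell
#!/bin/sh
# Hypothetical sketch of bin/download-spark.sh (versions and paths are assumed).

SPARK_VERSION="${SPARK_VERSION:-1.6.2}"
HADOOP_VERSION="${HADOOP_VERSION:-2.6}"
SPARK_ARCHIVE="spark-${SPARK_VERSION}-bin-hadoop${HADOOP_VERSION}"
SPARK_CACHE="${SPARK_CACHE:-local-spark}"

if [ -d "${SPARK_CACHE}/${SPARK_ARCHIVE}" ]; then
  # Idempotent: a second `get-spark` run finds the cached copy and does nothing.
  echo "Spark already downloaded: ${SPARK_CACHE}/${SPARK_ARCHIVE}"
else
  echo "Would download ${SPARK_ARCHIVE}.tgz from an Apache mirror into ${SPARK_CACHE}/"
  # mkdir -p "${SPARK_CACHE}"
  # curl -fSL "https://archive.apache.org/dist/spark/spark-${SPARK_VERSION}/${SPARK_ARCHIVE}.tgz" \
  #   | tar -xz -C "${SPARK_CACHE}"  # actual fetch, disabled in this sketch
fi
```

    Because CI never invokes `get-spark`, nothing in this path runs during a 
normal build or during SeleniumTests.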
    
    > **3.** -Ppyspark disappeared, but I remember it was added because we need 
to re-pack some files from Apache Spark to incorporate them in the Zeppelin 
build in order for it to work on a cluster. Is that no longer the case? For 
Spark standalone, YARN, etc.
    
    The `pyspark` profile only exists in `spark-dependencies` (please see 
[here](https://github.com/apache/zeppelin/blob/master/spark-dependencies/pom.xml#L820)).
 Since `spark-dependencies` won't exist anymore, I believe `-Ppyspark` needs to 
be removed accordingly.


