Answers to your questions:

You do not explicitly need to install Spark and Hadoop before building 
Zeppelin. They can be embedded: while building Zeppelin, you can specify 
the Spark and Hadoop versions you need. And yes, SPARK_HOME can be used to 
point to an external Spark and Hadoop installation.
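As a minimal sketch, pointing Zeppelin at an external installation is done in conf/zeppelin-env.sh (the paths below are illustrative assumptions, not defaults — adjust them to your environment):

```shell
# conf/zeppelin-env.sh -- point Zeppelin at an external Spark/Hadoop
# (example paths; substitute your own installation directories)
export SPARK_HOME=/opt/spark             # external Spark installation
export HADOOP_CONF_DIR=/etc/hadoop/conf  # external Hadoop configuration
```

With SPARK_HOME set, Zeppelin's Spark interpreter uses that installation instead of the embedded build.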

To build, you may specify any Spark and Hadoop versions. For example:

mvn clean package -Pspark-1.6 -Phadoop-2.4 -Pyarn -Ppyspark


For Cassandra integration, build using the option -Pcassandra-spark-xx, where 
xx matches the Spark version you are building against.
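On the question of checking which Spark client library a notebook actually uses: when SPARK_HOME points at an external installation, you can verify its version from the command line (assuming SPARK_HOME is set as above):

```shell
# Print the version of the Spark installation that SPARK_HOME points to
$SPARK_HOME/bin/spark-submit --version
```

Inside a notebook, evaluating sc.version in a Spark paragraph reports the version actually in use by the interpreter.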

/Junaid


On 12 Jan 2016, at 00:45, Victor Coustenoble 
<victor.cousteno...@datastax.com> wrote:

A few questions on build options:

- Spark and Hadoop are needed to build Zeppelin for the client binary 
libraries, right? And with SPARK_HOME set, another client library version can 
be used, right?
- If I don't specify any options, are default Spark and Hadoop versions 
embedded? I can't find them in the pom file.
- Is it possible to check the Spark client library version used in a notebook?
- If I only specify the cassandra-spark option, do I also get the 
corresponding Spark client library version?

Thanks
Victor
