The -pl option works with Maven 3.2.3 as well. You can exclude multiple
modules from the build (and the resulting tar) like this:
-pl '!ignite,!flink,!tajo,!cassandra,!lens,!kylin,!phoenix'.
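Combined with the build command quoted at the bottom of this thread, a full invocation could look like the sketch below. The module list is just the example from this message; adjust it to the modules actually present in your checkout.

```shell
# Build Zeppelin while skipping interpreters you don't need; this reduces
# both build time and the size of the resulting tar. Profiles and versions
# are the ones from this thread -- change them for your environment.
mvn clean package -Pspark-1.4 -Dspark.version=1.4.1 \
    -Dhadoop.version=2.4.0 -Phadoop-2.4 -Pyarn -DskipTests -P build-distr \
    -pl '!ignite,!flink,!tajo,!cassandra,!lens,!kylin,!phoenix'
```

Note the leading '!' marks a module as excluded; quoting the list keeps the shell from interpreting it.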

On 2 September 2015 at 17:51, IT CTO <[email protected]> wrote:

> Also, if you are using Maven 3.3 you can add -pl '!flink' or any other
> interpreter you don't need. It will reduce both the build time and size.
>
> On Wed, 2 Sep 2015 at 18:48, Alexander Bezzubov <[email protected]> wrote:
>
>> Hi,
>>
>> thank you for your interest in the Zeppelin project!
>>
>> Yes, by default the build you ran includes many different interpreters
>> (Spark, Flink, Lens, etc.), which is why the archive is so large.
>>
>> In case you are going to use an existing Spark/Hadoop installation: as of
>> https://issues.apache.org/jira/browse/ZEPPELIN-160 there is now an
>> option to build Zeppelin with those dependencies in provided scope
>> (so they are not included in the final archive).
>> Then you just need to set SPARK_HOME and HADOOP_HOME to be able to use
>> the existing Spark/Hadoop.
>>
>> Please, let me know if that helps!
>>
>> On Thu, Sep 3, 2015 at 12:38 AM, MrAsanjar . <[email protected]> wrote:
>>
>>> I built Zeppelin with the following options, as documented:
>>> *mvn clean package -Pspark-1.4 -Dspark.version=1.4.1
>>> -Dhadoop.version=2.4.0 -Phadoop-2.4 -Pyarn -DskipTests -P build-distr*
>>>
>>> However, the generated tarfile in
>>> zeppelin-distribution/target/zeppelin-0.6.0-incubating-SNAPSHOT.tar.gz
>>> is *414 MB*. Is that correct?
>>> I also noticed it includes Spark, Hadoop, and other tarfiles. Do I need
>>> them if I am using an existing Hadoop & Spark client that is already
>>> configured and functioning?
>>>
>>
>>
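For reference, pointing Zeppelin at an existing installation as Alex describes above usually means exporting the two variables he mentions in conf/zeppelin-env.sh. The paths below are placeholders, not from this thread; use your actual install locations.

```shell
# conf/zeppelin-env.sh -- use existing installs instead of bundled ones.
# /opt/spark and /opt/hadoop are example paths; substitute your own.
export SPARK_HOME=/opt/spark        # existing Spark installation
export HADOOP_HOME=/opt/hadoop      # existing Hadoop installation
```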


-- 
Christian Tzolov <http://www.linkedin.com/in/tzolov> | Solution Architect,
EMEA Practice Team | Pivotal <http://pivotal.io/>
[email protected]|+31610285517
