---------- Forwarded message ----------
From: vetal king <[email protected]>
Date: Mon, Apr 4, 2016 at 8:59 PM
Subject: Re: All inclusive uber-jar
To: Mich Talebzadeh <[email protected]>
Not sure how to create an uber jar using sbt, but this is how you can do it
using Maven:
<plugins>
  ....
  <plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-shade-plugin</artifactId>
    <version>2.4.3</version>
    <executions>
      <execution>
        <phase>package</phase>
        <goals>
          <goal>shade</goal>
        </goals>
        <configuration>
          <artifactSet>
            <includes>
              <include>*:*</include>
            </includes>
          </artifactSet>
        </configuration>
      </execution>
    </executions>
  </plugin>
</plugins>
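For the sbt side, the usual route is the sbt-assembly plugin. A minimal sketch (plugin version and merge strategy are illustrative; adjust for your sbt version):

```scala
// project/plugins.sbt
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.3")

// build.sbt
// Mark spark-streaming "provided" so the cluster's Spark jars are not bundled;
// the Kafka integration is not on the cluster, so it stays in the fat jar.
libraryDependencies += "org.apache.spark" %% "spark-streaming" % "1.6.1" % "provided"
libraryDependencies += "org.apache.spark" %% "spark-streaming-kafka" % "1.6.1"

// Resolve duplicate files (e.g. META-INF signatures) pulled in by multiple jars
assemblyMergeStrategy in assembly := {
  case PathList("META-INF", xs @ _*) => MergeStrategy.discard
  case x                             => MergeStrategy.first
}
```

Running `sbt assembly` then writes the self-contained jar under target/scala-2.10/, which you can pass directly to spark-submit.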
But instead of creating an uber jar, consider providing Maven coordinates
with the help of spark-submit's --packages and --repositories options.
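For example, a sketch of the --packages route (the main class and jar path here are hypothetical; the coordinate matches the spark-streaming-kafka dependency from your build, and --repositories is only needed for artifacts outside Maven Central):

```shell
# Spark resolves the coordinate from Maven Central at submit time
# and ships it to the executors, so no fat jar is needed.
${SPARK_HOME}/bin/spark-submit \
  --class com.example.StreamingApp \
  --master yarn \
  --packages org.apache.spark:spark-streaming-kafka_2.10:1.6.1 \
  target/scala-2.10/streamingapp_2.10-1.0.jar
```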
On Mon, Apr 4, 2016 at 7:06 PM, Mich Talebzadeh <[email protected]>
wrote:
> Hi,
>
>
> When one builds a project for Spark, in this case Spark Streaming, with SBT,
> I add dependencies as usual:
>
> libraryDependencies += "org.apache.spark" %% "spark-streaming" % "1.6.1"
> libraryDependencies += "org.apache.spark" %% "spark-streaming-kafka" %
> "1.6.1"
>
> However, when I submit it through spark-submit I need to supply the jar
> containing KafkaUtils, the same way I do in spark-shell:
>
> ${SPARK_HOME}/bin/spark-submit \
> --jars
> /home/hduser/jars/spark-streaming-kafka-assembly_2.10-1.6.1.jar \
> .....
>
> Now if I want to distribute this as an all-in-one package so that it can be
> run from any node, I have been told that I need to create an uber-jar. I
> have not done this before, so I assume an uber-jar will be totally
> self-contained with all the classes etc.
>
> Can someone elaborate on this please?
>
> Thanks
>
> Dr Mich Talebzadeh
>
>
>
> LinkedIn:
> https://www.linkedin.com/profile/view?id=AAEAAAAWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw
>
>
>
> http://talebzadehmich.wordpress.com
>
>
>