Re: All inclusive uber-jar

2016-04-04 Thread Haroon Rasheed
> { > case PathList("META-INF", xs @ _*) => MergeStrategy.discard > case x => MergeStrategy.first > } > } > > So if I want to create an sbt assembly jar file and assuming that the > Scala file is called TestStream_assembly then I just run the shell script
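The quoted fragment above is the tail of an sbt-assembly merge strategy. Reassembled into build.sbt form (the lines around the two `case` clauses follow the standard sbt-assembly 0.14.x idiom and are an assumption, not the poster's actual file):

```scala
// build.sbt -- sketch reconstructed from the quoted fragment;
// everything outside the two `case` lines is assumed, not quoted
assemblyMergeStrategy in assembly := {
  // signature and manifest files under META-INF clash when many
  // jars are merged into one, so discard them
  case PathList("META-INF", xs @ _*) => MergeStrategy.discard
  // for any other duplicate entry, keep the first copy encountered
  case x => MergeStrategy.first
}
```

Discarding META-INF this way is what avoids the "invalid signature file digest" errors that otherwise appear when signed jars are flattened into an uber jar.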

Re: All inclusive uber-jar

2016-04-04 Thread Mich Talebzadeh
ksh -A TestStream_assembly -T assembly For simple sbt that will be ./generic.ksh -A TestStream -T sbt and for maven it will be ./generic.ksh -A TestStream -T mvn Cheers, Dr Mich Talebzadeh LinkedIn * https://www.linkedin.com/profile/view?id=AAEWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw
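The `generic.ksh` wrapper itself is not shown in the thread; only its `-A` (application name) and `-T` (build tool) flags appear above. A hypothetical reconstruction of such a dispatcher, which here just prints the build command it would run rather than invoking sbt or maven:

```shell
#!/bin/sh
# Hypothetical sketch of a generic.ksh-style wrapper (the real script is
# not in the thread): -A names the application, -T selects the build tool.
build_cmd() {
  APP=""
  TOOL=""
  OPTIND=1   # reset so the function can be called more than once
  while getopts "A:T:" opt; do
    case "$opt" in
      A) APP="$OPTARG" ;;
      T) TOOL="$OPTARG" ;;
    esac
  done
  case "$TOOL" in
    assembly) echo "sbt assembly" ;;   # fat jar for $APP
    sbt)      echo "sbt package"  ;;   # plain jar for $APP
    mvn)      echo "mvn package"  ;;   # maven build for $APP
    *)        echo "usage: -A <app> -T assembly|sbt|mvn" >&2; return 1 ;;
  esac
}

build_cmd -A TestStream_assembly -T assembly   # prints "sbt assembly"
```

The point of such a wrapper is a single entry point per project regardless of whether the jar is built with plain sbt, sbt-assembly, or maven.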

Fwd: All inclusive uber-jar

2016-04-04 Thread vetal king
-- Forwarded message -- From: vetal king Date: Mon, Apr 4, 2016 at 8:59 PM Subject: Re: All inclusive uber-jar To: Mich Talebzadeh Not sure how to create an uber jar using sbt, but this is how you can do it using maven: org.apache.maven.plugins
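The forwarded message is cut off right after `org.apache.maven.plugins`, which is the groupId of the maven-shade-plugin typically used for this. A representative pom.xml fragment (plugin version and phase binding are assumptions, not taken from the thread):

```xml
<!-- Goes in <build><plugins> of pom.xml; illustrative sketch only -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <version>2.4.3</version>
  <executions>
    <execution>
      <!-- bind shading to the package phase so `mvn package`
           produces the uber jar directly -->
      <phase>package</phase>
      <goals>
        <goal>shade</goal>
      </goals>
    </execution>
  </executions>
</plugin>
```

With this in place, `mvn package` emits a single jar containing the project classes plus all compile-scope dependencies.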

Re: All inclusive uber-jar

2016-04-04 Thread Marco Mistroni
Hi, you can use sbt-assembly to create the uber jar. You should set the Spark libraries as 'provided' in your SBT build. HTH, Marco. PS: apologies if by any chance I'm telling you something you already know. On 4 Apr 2016 2:36 pm, "Mich Talebzadeh" wrote: > Hi, > > > When one builds a project for Spark in this case Spark str
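Marco's "provided" suggestion, sketched against the 1.6.1 dependencies from the original post: `provided` keeps Spark core out of the fat jar because the cluster already supplies it at runtime, while integration libraries that are not on the cluster classpath stay at compile scope so they get bundled.

```scala
// build.sbt -- "provided" excludes the artifact from the assembly;
// spark-submit places Spark itself on the classpath at runtime
libraryDependencies += "org.apache.spark" %% "spark-streaming" % "1.6.1" % "provided"

// spark-streaming-kafka is NOT shipped with the cluster, so leave it
// at the default compile scope to have sbt-assembly bundle it
libraryDependencies += "org.apache.spark" %% "spark-streaming-kafka" % "1.6.1"
```

This split is the usual reason an uber jar stays small while still being "all inclusive" for the pieces the cluster cannot provide.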

All inclusive uber-jar

2016-04-04 Thread Mich Talebzadeh
Hi,

When one builds a project for Spark, in this case Spark streaming with SBT, as usual I add the dependencies as follows:

libraryDependencies += "org.apache.spark" %% "spark-streaming" % "1.6.1"
libraryDependencies += "org.apache.spark" %% "spark-streaming-kafka" % "1.6.1"

However when I submit it
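The question is truncated at "when I submit it", but the thread topic is submitting a single self-contained jar. For context, a submission of an sbt-assembly output looks like the sketch below (class name, master, and jar path are hypothetical placeholders, not taken from the thread):

```shell
# Sketch: submit the fat jar produced by `sbt assembly`.
# --class and the jar path below are illustrative, not the poster's values.
spark-submit \
  --class example.TestStream \
  --master yarn \
  target/scala-2.10/TestStream-assembly-1.0.jar
```

Because the assembly jar already contains the non-provided dependencies (e.g. spark-streaming-kafka), no extra `--jars` or `--packages` flags are needed at submit time.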