Hi,

When I build a project for Spark, in this case Spark Streaming, with SBT,
I add the dependencies as usual:

libraryDependencies += "org.apache.spark" %% "spark-streaming" % "1.6.1"
libraryDependencies += "org.apache.spark" %% "spark-streaming-kafka" % "1.6.1"

However, when I submit the application through spark-submit, I need to pass
the jar containing KafkaUtils, the same way I do in spark-shell:

${SPARK_HOME}/bin/spark-submit \
    --jars /home/hduser/jars/spark-streaming-kafka-assembly_2.10-1.6.1.jar \
    .....

Now, if I want to distribute this as an all-in-one package so that it can
be run from any node, I have been told that I need to create an uber-jar.
I have not done this before, so I assume an uber-jar is fully
self-contained, with all the required classes bundled in.
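
From what I have read, the usual way to build an uber-jar with SBT is the
sbt-assembly plugin. Below is my current, untested understanding; the
plugin version, merge strategy and jar names are my own assumptions, so
corrections are welcome.

project/plugins.sbt:

addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.3")

build.sbt:

// Spark itself is already on every cluster node, so mark it "provided"
// to keep the uber-jar small; the Kafka integration is not shipped with
// Spark, so leave it to be bundled.
libraryDependencies += "org.apache.spark" %% "spark-streaming" % "1.6.1" % "provided"
libraryDependencies += "org.apache.spark" %% "spark-streaming-kafka" % "1.6.1"

// Several dependency jars ship conflicting META-INF files; discard those
// on merge and otherwise keep the first copy seen.
assemblyMergeStrategy in assembly := {
  case PathList("META-INF", xs @ _*) => MergeStrategy.discard
  case x                             => MergeStrategy.first
}

If that is right, running "sbt assembly" should produce something like
target/scala-2.10/myapp-assembly-1.0.jar, which could then be passed to
spark-submit directly, with no --jars needed.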

Can someone elaborate on this please?

Thanks

Dr Mich Talebzadeh



LinkedIn:
https://www.linkedin.com/profile/view?id=AAEAAAAWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw



http://talebzadehmich.wordpress.com
