Hi David,
Just package everything into one jar with Maven and deploy that; you don't need to do
it like this… Also check whether your cluster already has these libraries loaded:
if you are using CDH, for example, you can just import the classes because they are
already on the classpath.
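If it helps, a minimal maven-shade-plugin setup for building such an uber-jar could
look like the sketch below (the plugin version is an assumption, adjust it to your
build; also mark the Spark artifacts as provided so they are not bundled, since the
cluster already ships them):

  <build>
    <plugins>
      <!-- Bundle the project classes plus all non-provided
           dependencies into a single uber-jar at package time -->
      <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-shade-plugin</artifactId>
        <version>2.4.3</version>
        <executions>
          <execution>
            <phase>package</phase>
            <goals>
              <goal>shade</goal>
            </goals>
          </execution>
        </executions>
      </plugin>
    </plugins>
  </build>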
I have no problems when submitting the task with spark-submit: passing the list of
required jars via the --jars option works, and I can see the jars being added in
the output:
16/02/10 11:14:24 INFO spark.SparkContext: Added JAR
file:/usr/lib/spark/extras/lib/spark-streaming-kafka.jar at
http://192.
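For reference, the submit command looks roughly like this (the class name and
application jar are placeholders; the kafka jar path is the one from the log above):

  spark-submit \
    --class com.example.StreamingJob \
    --master yarn \
    --jars /usr/lib/spark/extras/lib/spark-streaming-kafka.jar \
    target/my-app-1.0.jar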