Re: Spark streaming: missing classes when kafka consumer classes

2014-12-12 Thread Mario Pastorelli
Hi, I asked on SO and got an answer about this: http://stackoverflow.com/questions/27444512/missing-classes-from-the-assembly-file-created-by-sbt-assembly . Adding fullClasspath in assembly := (fullClasspath in Compile).value at the end of my build.sbt solved the problem, apparently. Best, M
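
For context, a minimal build.sbt sketch with that setting in place; the project name, Scala version, and dependency versions here are illustrative assumptions, not taken from the thread:

    // build.sbt -- minimal sketch; names and versions are assumptions
    name := "spark-kafka-streaming-example"
    scalaVersion := "2.10.4"

    libraryDependencies ++= Seq(
      "org.apache.spark" %% "spark-streaming"       % "1.1.0" % "provided",
      "org.apache.spark" %% "spark-streaming-kafka" % "1.1.0"
    )

    // The fix from the SO answer: have sbt-assembly use the full
    // Compile classpath, so the Kafka consumer classes end up in the fat jar.
    fullClasspath in assembly := (fullClasspath in Compile).value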

Re: Spark streaming: missing classes when kafka consumer classes

2014-12-11 Thread Flávio Santos
Hi Mario, Try adding this to your libraryDependencies (in your sbt file): "org.apache.kafka" % "kafka_2.10" % "0.8.0" exclude("javax.jms", "jms") exclude("com.sun.jdmk", "jmxtools") exclude("com.sun.jmx", "jmxri") exclude("org.slf4j", "slf4j-simple") Regards, -- Flávio R.
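
Formatted as it would sit in a build.sbt (only the coordinates and excludes above come from the message; the surrounding declaration is boilerplate):

    libraryDependencies +=
      ("org.apache.kafka" % "kafka_2.10" % "0.8.0")
        .exclude("javax.jms", "jms")          // drop broken transitive deps
        .exclude("com.sun.jdmk", "jmxtools")  // not resolvable from Maven Central
        .exclude("com.sun.jmx", "jmxri")
        .exclude("org.slf4j", "slf4j-simple") // avoid a second slf4j binding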

Re: Spark streaming: missing classes when kafka consumer classes

2014-12-11 Thread Akhil Das
Last time I did an sbt assembly, this is how I added the dependencies: libraryDependencies ++= Seq( ("org.apache.spark" % "spark-streaming_2.10" % "1.1.0" % "provided"). exclude("org.eclipse.jetty.orbit", "javax.transaction"). exclude("org.eclipse.jetty.orbit", "javax.mail"). exc
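
The snippet above is cut off by the archive; a plausible complete form, using the exclusion pattern commonly seen in Spark sbt builds of that era (the excludes past the truncation point and the kafka artifact line are assumptions, not recovered from the message):

    libraryDependencies ++= Seq(
      ("org.apache.spark" % "spark-streaming_2.10" % "1.1.0" % "provided").
        exclude("org.eclipse.jetty.orbit", "javax.transaction").
        exclude("org.eclipse.jetty.orbit", "javax.mail").
        // the original message is truncated here; the rest is assumed
        exclude("org.eclipse.jetty.orbit", "javax.activation"),
      "org.apache.spark" % "spark-streaming-kafka_2.10" % "1.1.0"
    )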

Re: Spark streaming: missing classes when kafka consumer classes

2014-12-11 Thread Mario Pastorelli
Thanks, Akhil, for the answer. I am using sbt assembly, and the build.sbt is in the first email. Do you know why those classes are included in that way? Thanks, Mario On 11.12.2014 14:51, Akhil Das wrote: Yes. You can do/use *sbt assembly* and create a big fat jar with all dependencies bundled

Re: Spark streaming: missing classes when kafka consumer classes

2014-12-11 Thread Akhil Das
Yes. You can do/use *sbt assembly* and create a big fat jar with all dependencies bundled inside it. Thanks Best Regards On Thu, Dec 11, 2014 at 7:10 PM, Mario Pastorelli < mario.pastore...@teralytics.ch> wrote: > In this way it works but it's not portable and the idea of having a fat > jar is
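
For readers following along: sbt assembly requires the sbt-assembly plugin. A typical project/plugins.sbt for that era would look like this (the plugin version is an assumption):

    // project/plugins.sbt -- version is an assumption for sbt 0.13.x
    addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.12.0")

Running "sbt assembly" then produces the fat jar under target/scala-2.10/, which can be passed directly to spark-submit.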

Re: Spark streaming: missing classes when kafka consumer classes

2014-12-11 Thread Mario Pastorelli
This way it works, but it's not portable, and the idea of having a fat jar is to avoid exactly this. Is there any way to create a self-contained, portable fat jar? On 11.12.2014 13:57, Akhil Das wrote: Add these jars while creating the Context. val sc = new SparkContext(conf) sc.add

Re: Spark streaming: missing classes when kafka consumer classes

2014-12-11 Thread Akhil Das
Add these jars while creating the Context. val sc = new SparkContext(conf) sc.addJar("/home/akhld/.ivy2/cache/org.apache.spark/spark-streaming-kafka_2.10/jars/ *spark-streaming-kafka_2.10-1.1.0.jar*") sc.addJar("/home/akhld/.ivy2/cache/com.101tec/zkclient/jars/ *zkclient-0.3.jar*"
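
Spelled out as a complete driver snippet (the message is truncated; the SparkConf setup and app name are assumptions, and the local ivy-cache paths follow Akhil's examples and would differ on another machine):

    import org.apache.spark.{SparkConf, SparkContext}

    // Sketch of the approach described above: ship the missing jars
    // to the executors explicitly via addJar.
    val conf = new SparkConf().setAppName("KafkaStreamingApp") // name is assumed
    val sc = new SparkContext(conf)
    sc.addJar("/home/akhld/.ivy2/cache/org.apache.spark/spark-streaming-kafka_2.10/jars/spark-streaming-kafka_2.10-1.1.0.jar")
    sc.addJar("/home/akhld/.ivy2/cache/com.101tec/zkclient/jars/zkclient-0.3.jar")
    // Likely also needed (assumption): the kafka and metrics jars that
    // spark-streaming-kafka pulls in transitively, added the same way.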

Spark streaming: missing classes when kafka consumer classes

2014-12-11 Thread Mario Pastorelli
Hi, I'm trying to use spark-streaming with kafka, but I get a strange error about missing classes. I would like to ask whether my way of building the fat jar is correct or not. My program is val kafkaStream = KafkaUtils.createStream(ssc, zookeeperQuorum, kafkaGroupId, kafkaTopicsWithThreads)
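
A minimal self-contained sketch around that call, for readers reproducing the setup; only the createStream line comes from the original message, and every variable value here is a placeholder assumption:

    import org.apache.spark.SparkConf
    import org.apache.spark.streaming.{Seconds, StreamingContext}
    import org.apache.spark.streaming.kafka.KafkaUtils

    // Placeholder values -- not from the original message
    val conf = new SparkConf().setAppName("KafkaConsumer")
    val ssc  = new StreamingContext(conf, Seconds(10))

    val zookeeperQuorum        = "localhost:2181"
    val kafkaGroupId           = "my-consumer-group"
    val kafkaTopicsWithThreads = Map("my-topic" -> 1) // topic -> receiver thread count

    // The call from the original message; yields a DStream of (key, value) pairs
    val kafkaStream = KafkaUtils.createStream(ssc, zookeeperQuorum, kafkaGroupId, kafkaTopicsWithThreads)

    kafkaStream.map(_._2).print() // print message values only
    ssc.start()
    ssc.awaitTermination()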