Hi everybody, I have four Kafka topics, one for each operation (Add, Delete, Update, Merge), so Spark will also have four consumed streams. How should I run my Spark job here?
Should I run four Spark jobs separately, or is there a way to bundle all the streams into a single jar and run them as one job? Something like the sketch below is what I have in mind for the single-job option. Thanks in advance.
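Here is a minimal sketch of the single-job idea, assuming the Spark Streaming Kafka direct API (spark-streaming-kafka): one StreamingContext driving one direct stream per topic, all packaged in a single jar. The topic names, broker address, batch interval, and per-topic handlers are placeholders, not my real setup:

import kafka.serializer.StringDecoder
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.kafka.KafkaUtils

object FourTopicJob {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("FourTopicJob")
    // A single StreamingContext can own all four streams: one jar, one job.
    val ssc = new StreamingContext(conf, Seconds(10))

    val kafkaParams = Map("metadata.broker.list" -> "localhost:9092") // assumed broker address

    // One direct stream per topic, so each operation keeps its own handler.
    def streamFor(topic: String) =
      KafkaUtils.createDirectStream[String, String, StringDecoder, StringDecoder](
        ssc, kafkaParams, Set(topic))

    streamFor("Add").foreachRDD    { rdd => println(s"Add batch: ${rdd.count()}") }    // placeholder handler
    streamFor("Delete").foreachRDD { rdd => println(s"Delete batch: ${rdd.count()}") } // placeholder handler
    streamFor("Update").foreachRDD { rdd => println(s"Update batch: ${rdd.count()}") } // placeholder handler
    streamFor("Merge").foreachRDD  { rdd => println(s"Merge batch: ${rdd.count()}") }  // placeholder handler

    ssc.start()
    ssc.awaitTermination()
  }
}

Would something like this be the recommended approach, or is running four separate jobs better?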