You can use https://github.com/wurstmeister/kafka-docker to spin up a local Kafka cluster in Docker, then point your Spark Streaming job at it to consume from it.
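For reference, a minimal docker-compose.yml along the lines of what that repo ships (the topic name `test-input` is just an illustrative assumption; check the repo's README for the current image names and options):

```yaml
version: '2'
services:
  zookeeper:
    image: wurstmeister/zookeeper
    ports:
      - "2181:2181"
  kafka:
    image: wurstmeister/kafka
    ports:
      - "9092:9092"
    environment:
      # Address your Spark Streaming job will use to reach the broker
      KAFKA_ADVERTISED_HOST_NAME: 127.0.0.1
      # Pre-create a topic for the test: name:partitions:replication-factor
      # ("test-input" is a hypothetical topic name for illustration)
      KAFKA_CREATE_TOPICS: "test-input:1:1"
      KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181
```

Then `docker-compose up -d`, produce your test messages to the topic, and have the streaming job consume from 127.0.0.1:9092.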
On Fri, Jul 1, 2016 at 1:19 AM, SRK <swethakasire...@gmail.com> wrote:
> Hi,
>
> I need to do integration tests using Spark Streaming. My idea is to spin up
> Kafka using docker locally and use it to feed the stream to my Streaming
> Job. Any suggestions on how to do this would be of great help.
>
> Thanks,
> Swetha
>
> --
> View this message in context:
> http://apache-spark-user-list.1001560.n3.nabble.com/How-to-spin-up-Kafka-using-docker-and-use-for-Spark-Streaming-Integration-tests-tp27252.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.

--
Cheers!