Hi, I need to write integration tests for Spark Streaming. My idea is to spin up Kafka locally using Docker and use it to feed the stream to my streaming job. Any suggestions on how to do this would be of great help.
Thanks, Swetha
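For illustration, here is a minimal sketch of what the consuming side of such a test could look like, assuming a Kafka broker has already been started in a local Docker container and is reachable on localhost:9092, and assuming the spark-streaming-kafka-0-8 connector and a pre-created topic named "test-topic" (both the image choice and topic name are hypothetical):

import kafka.serializer.StringDecoder
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.kafka.KafkaUtils

object KafkaStreamingIntegrationSketch {
  def main(args: Array[String]): Unit = {
    // Run Spark locally so the test needs no cluster.
    val conf = new SparkConf().setMaster("local[2]").setAppName("kafka-integration-test")
    val ssc = new StreamingContext(conf, Seconds(1))

    // Assumes the Dockerized Kafka broker is listening on localhost:9092.
    val kafkaParams = Map[String, String]("metadata.broker.list" -> "localhost:9092")
    val topics = Set("test-topic") // hypothetical topic name

    // Direct stream from Kafka (spark-streaming-kafka-0-8 API).
    val stream = KafkaUtils.createDirectStream[String, String, StringDecoder, StringDecoder](
      ssc, kafkaParams, topics)

    // In a real test you would produce known messages to the topic,
    // collect the results here, and assert on them; printing is just a placeholder.
    stream.map(_._2).print()

    ssc.start()
    ssc.awaitTerminationOrTimeout(10000)
    ssc.stop(stopSparkContext = true, stopGracefully = true)
  }
}

A test harness would typically start the Kafka container (for example with docker or docker-compose) in a setup step, publish test messages with a Kafka producer, run the job above, and tear the container down afterwards; the exact image and orchestration are left open here.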