Hi all,

I'm building a Spark Streaming application that continuously reads from
multiple Kafka topics at the same time.  However, I ran into a strange issue:
it reads only a few hundred messages and then stops consuming entirely.  If I
change the three topics to a single topic, everything is fine and it keeps
consuming.  Below is the code I have.

val consumerThreadsPerInputDstream = 1
// topic name -> number of consumer threads within this single receiver
val topics = Map("raw_0" -> consumerThreadsPerInputDstream,
                 "raw_1" -> consumerThreadsPerInputDstream,
                 "raw_2" -> consumerThreadsPerInputDstream)

val msgs = KafkaUtils.createStream(ssc, "10.10.10.10:2181/hkafka",
                                   "group01", topics).map(_._2)
.......

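For context, here is roughly how the full setup looks as a self-contained
sketch.  The SparkConf/StreamingContext lines, the app name, the master, the
10-second batch interval, and the msgs.print() standing in for the elided part
are placeholders I added for illustration, not my exact code:

import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.kafka.KafkaUtils

// Placeholder app name, master, and batch interval.
// Note: the receiver-based createStream occupies one core, so the master
// needs more cores than receivers for processing to make progress.
val conf = new SparkConf().setAppName("MultiTopicKafkaStream").setMaster("local[4]")
val ssc = new StreamingContext(conf, Seconds(10))

val consumerThreadsPerInputDstream = 1
val topics = Map("raw_0" -> consumerThreadsPerInputDstream,
                 "raw_1" -> consumerThreadsPerInputDstream,
                 "raw_2" -> consumerThreadsPerInputDstream)

// createStream returns a DStream of (key, message) pairs; map(_._2) keeps only the payload.
val msgs = KafkaUtils.createStream(ssc, "10.10.10.10:2181/hkafka",
                                   "group01", topics).map(_._2)

msgs.print()  // placeholder for the real downstream processing

ssc.start()
ssc.awaitTermination()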
Why does it stop consuming after a few hundred messages when reading from
three topics?  How can I resolve this issue?

Thank you for your help,
Eason


