Hi all,

I am using a direct Kafka input stream in my Spark app. When I use the
window(...) function in the chain, it causes the processing pipeline to
stop: when I open the Spark UI I can see that the streaming batches are
being queued and the pipeline reports that it is still processing one of
the first batches.
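
(For context, a minimal sketch of the kind of pipeline being described,
assuming the Spark 1.x direct-stream API from spark-streaming-kafka; the
broker address, topic name, and intervals below are placeholders:)

    import kafka.serializer.StringDecoder
    import org.apache.spark.SparkConf
    import org.apache.spark.streaming.{Seconds, StreamingContext}
    import org.apache.spark.streaming.kafka.KafkaUtils

    object WindowedKafkaApp {
      def main(args: Array[String]): Unit = {
        val ssc = new StreamingContext(
          new SparkConf().setAppName("WindowedKafkaApp"), Seconds(10))

        // Direct (receiver-less) Kafka stream; broker and topic are placeholders
        val kafkaParams = Map("metadata.broker.list" -> "localhost:9092")
        val stream = KafkaUtils.createDirectStream[String, String, StringDecoder, StringDecoder](
          ssc, kafkaParams, Set("events"))

        // 60s window sliding every 20s; both must be multiples of the 10s batch interval
        val windowed = stream.map(_._2).window(Seconds(60), Seconds(20))
        windowed.print()  // <- the output operation in question

        ssc.start()
        ssc.awaitTermination()
      }
    }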
print() will confuse the issue, since print() will try to only
use the first partition.

Use foreachRDD { rdd => rdd.foreach(println) }
or something comparable.
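
Applied to the sketch above, that suggestion would look roughly like this
(`windowed` standing in for whatever DStream the window(...) call produced):

    // Replace windowed.print() with an output operation that touches every partition:
    windowed.foreachRDD { rdd =>
      rdd.foreach(println)  // println runs on the executors, so output lands in executor logs
    }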
On Tue, Mar 22, 2016 at 10:14 AM, Martin Soch wrote:
> Hi all,
> I am using direct-Kafka-input-stream in my Spark app. When I use
> window(...) [...]