Thanks for the replies.
What is not completely clear to me is how time is managed.
I can create a DStream from files.
But if I set the window property, that will be bound to the application time, right?
If I got it right, with a receiver I can control the way DStreams are created.
But, how can appl
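
For what it's worth, here is a minimal sketch of the windowing API being discussed, assuming files are dropped into a monitored directory and that the window should cover the last 10 minutes sliding every minute (the path and both durations are placeholders, not from this thread). Note that DStream windows are cut on processing time, i.e. when batches arrive, not on timestamps inside the log records:

import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Minutes, Seconds, StreamingContext}

object WindowedLogStats {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("WindowedLogStats")
    // The batch interval defines the stream's notion of time: window and
    // slide durations must be multiples of it, and both are based on
    // arrival (processing) time.
    val ssc = new StreamingContext(conf, Seconds(60))

    // Picks up new files appearing in the directory (placeholder path).
    val lines = ssc.textFileStream("/data/logs")

    // Aggregate over the last 10 minutes, sliding every minute.
    val counts = lines
      .window(Minutes(10), Minutes(1))
      .count()

    counts.print()
    ssc.start()
    ssc.awaitTermination()
  }
}
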
Hi,
I have a large dataset of text log files on which I need to implement
"window analysis".
Say, extract per-minute data and compute aggregated stats on the last X minutes.
I have to implement the windowing analysis with Spark.
This is the workflow I'm currently using:
- read a file and create a new RDD
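
A rough sketch of the batch version of that workflow, under my own assumption (for illustration only) that each log line starts with an epoch-millisecond timestamp in its first whitespace-separated field; the file path is a placeholder as well:

import org.apache.spark.{SparkConf, SparkContext}

object PerMinuteStats {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("PerMinuteStats"))

    // Placeholder path to the raw text logs.
    val lines = sc.textFile("/data/logs/app.log")

    // Bucket lines by minute using the timestamp embedded in each record
    // (assumed format: epoch millis in the first field).
    val perMinuteCounts = lines
      .map { line =>
        val tsMillis = line.split("\\s+")(0).toLong
        val minute = tsMillis / 60000L
        (minute, 1L)
      }
      .reduceByKey(_ + _)

    // "Last X minutes" stats then become a filter on the minute key,
    // e.g. the most recent 10 minutes present in the data.
    val maxMinute = perMinuteCounts.keys.max()
    val lastTen = perMinuteCounts.filter { case (m, _) => m > maxMinute - 10 }
    lastTen.collect().foreach(println)

    sc.stop()
  }
}

With the data keyed by minute, the batch job answers the same question as a streaming window, but on event time taken from the logs rather than on arrival time.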